More stories

  • Inflatable robotic hand gives amputees real-time tactile control

    For the more than 5 million people in the world who have undergone an upper-limb amputation, prosthetics have come a long way. Beyond traditional mannequin-like appendages, there is a growing number of commercial neuroprosthetics — highly articulated bionic limbs, engineered to sense a user’s residual muscle signals and robotically mimic their intended motions.
    But this high-tech dexterity comes at a price. Neuroprosthetics can cost tens of thousands of dollars and are built around metal skeletons, with electrical motors that can be heavy and rigid.
    Now engineers at MIT and Shanghai Jiao Tong University have designed a soft, lightweight, and potentially low-cost neuroprosthetic hand. Amputees who tested the artificial limb performed daily activities, such as zipping a suitcase, pouring a carton of juice, and petting a cat, just as well as — and in some cases better than — those with more rigid neuroprosthetics.
    The researchers found the prosthetic, designed with a system for tactile feedback, restored some primitive sensation in a volunteer’s residual limb. The new design is also surprisingly durable, quickly recovering after being struck with a hammer or run over with a car.
    The smart hand is soft and elastic, and weighs about half a pound. Its components total around $500 — a fraction of the weight and material cost associated with more rigid smart limbs.
    “This is not a product yet, but the performance is already similar or superior to existing neuroprosthetics, which we’re excited about,” says Xuanhe Zhao, professor of mechanical engineering and of civil and environmental engineering at MIT. “There’s huge potential to make this soft prosthetic very low cost, for low-income families who have suffered from amputation.”
    Zhao and his colleagues have published their work today in Nature Biomedical Engineering. Co-authors include MIT postdoc Shaoting Lin, along with Guoying Gu, Xiangyang Zhu, and collaborators at Shanghai Jiao Tong University in China.

  • 'Missing jigsaw piece': Engineers make critical advance in quantum computer design

    Quantum engineers from UNSW Sydney have removed a major obstacle that has stood in the way of quantum computers becoming a reality: they discovered a new technique they say will be capable of controlling millions of spin qubits — the basic units of information in a silicon quantum processor.
    Until now, quantum computer engineers and scientists have worked with a proof-of-concept model of quantum processors by demonstrating the control of only a handful of qubits.
    But with their latest research, published today in Science Advances, the team have found what they consider ‘the missing jigsaw piece’ in the quantum computer architecture that should enable the control of the millions of qubits needed for extraordinarily complex calculations.
    Dr Jarryd Pla, a faculty member in UNSW’s School of Electrical Engineering and Telecommunications, says his research team wanted to crack the problem that had stumped quantum computer scientists for decades: how to control not just a few, but millions of qubits without taking up valuable space with more wiring, using more electricity, and generating more heat.
    “Up until this point, controlling electron spin qubits relied on us delivering microwave magnetic fields by putting a current through a wire right beside the qubit,” Dr Pla says.
    “This poses some real challenges if we want to scale up to the millions of qubits that a quantum computer will need to solve globally significant problems, such as the design of new vaccines.”

  • Faster path planning for rubble-roving robots

    Robots that need to use their arms to make their way across treacherous terrain just got a speed upgrade with a new path planning approach, developed by University of Michigan researchers.
    The improved path planning algorithm, designed for robots that use arm-like appendages to maintain balance on treacherous terrain such as disaster areas or construction sites, found successful paths three times as often as standard algorithms while needing much less processing time.
    “In a collapsed building or on very rough terrain, a robot won’t always be able to balance itself and move forward with just its feet,” said Dmitry Berenson, associate professor of electrical and computer engineering and core faculty at the Robotics Institute.
    “You need new algorithms to figure out where to put both feet and hands. You need to coordinate all these limbs together to maintain stability, and what that boils down to is a very difficult problem.”
    The research enables robots to determine how difficult the terrain is before calculating a successful path forward, which might include bracing on the wall with one or two hands while taking the next step forward.
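The core idea of scoring terrain difficulty up front, then letting the planner prefer easier footholds, can be illustrated with a minimal sketch. The grid, costs, and function names below are hypothetical stand-ins, not the U-M algorithm:

```python
import heapq

def plan_path(difficulty, start, goal):
    """Dijkstra search over a grid where each cell carries a
    precomputed terrain-difficulty score; cheaper (easier) cells
    are preferred, and impassable cells are marked with None."""
    rows, cols = len(difficulty), len(difficulty[0])
    frontier = [(0, start, [start])]
    best = {start: 0}
    while frontier:
        cost, (r, c), path = heapq.heappop(frontier)
        if (r, c) == goal:
            return cost, path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and difficulty[nr][nc] is not None:
                ncost = cost + difficulty[nr][nc]
                if ncost < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = ncost
                    heapq.heappush(frontier, (ncost, (nr, nc), path + [(nr, nc)]))
    return None  # no traversable path exists

# 1 = easy floor, higher = rougher rubble, None = impassable
grid = [
    [1, 1,    5, 1],
    [1, None, 5, 1],
    [1, 1,    1, 1],
]
cost, path = plan_path(grid, (0, 0), (0, 3))
```

A planner like this routes around high-difficulty cells when a longer but easier path is cheaper overall; the real system reasons about hand and foot placements rather than grid cells.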

  • Gender, personality influence use of interactive tools online

    People’s personality — such as how extroverted or introverted they are — and their gender can be linked to how they interact online, and whether they prefer interacting with a system rather than with other people.
    In a study, a team of researchers found that people considered websites more interactive if they had tools to facilitate communication between users, often referred to as computer-mediated communication, or CMC. However, extroverted men also considered sites more interactive when they offered tools for interacting with the computer itself, called human-computer interaction, or HCI, whereas extroverted women viewed sites with CMC tools as more interactive.
    “When you go to a website — for example, the Google search engine — you’re essentially engaging in HCI, which is different from CMC, which is when you’re communicating with other humans through computer technology,” said S. Shyam Sundar, James P. Jimirro Professor of Media Effects in the Donald P. Bellisario College of Communications and co-director of the Media Effects Research Laboratory. “When we talk about HCI here, it’s really about the degree to which the system or the machine allows us to interact with it, and it includes everything from how we swipe and tap on our mobile devices, to how we try to access different information through links on a website. When we talk about CMC, it is about the tools to chat with somebody else, like a customer service agent through an online portal, or when we’re having a video chat via Zoom, for example.”
    Knowing who your web visitors are and what engages them is an important part of creating good user experiences, added Sundar, who is also an affiliate of the Institute for Computational and Data Science. “For developers, it’s useful to know who will appreciate what types of interactivity that you have to offer, or what kind of interactivity should you offer to which kind of people.
    “These are actually quite important business decisions, because they cost a lot of money and have a lot of backend consequences,” said Sundar. For example, in an e-commerce site, which may be primarily trafficked by women, the findings suggest that efforts should be made to provide ways to talk to other people, such as chat tools, rather than simply tools to interact with the computer, such as being able to turn an image of a product in all directions.
    Real-world behaviors in the virtual world
    When people use websites, many of the habits and behaviors they have adopted in real life influence their behaviors online, said Yan Huang, assistant professor of integrated strategic communication in the Jack J. Valenti School of Communication, University of Houston and first author of the paper. The study is in line with that, she added, demonstrating how people who are extroverted in real life also like to interact in virtual settings.

  • New algorithm can help improve cellular materials design

    New research published in Scientific Reports has revealed that a simple but robust algorithm can help engineers improve the design of cellular materials used in a variety of applications, ranging from defence and biomedical devices to smart structures and the aerospace sector.
    The way in which cellular materials will perform can be uncertain, so calculations that help engineers predict how a particular design will react to a given set of loads, conditions, and constraints can help optimise the design and its subsequent performance.
    The research collaborators at the Faculty of Science and Engineering, Swansea University, Indian Institute of Technology Delhi and Brown University, USA, found that running specialised calculations can help engineers to find the optimum micro-structure for cellular materials that are used for a wide range of purposes, from advanced aerospace applications to stents used for blocked arteries.
    Research author Dr Tanmoy Chatterjee said: “This paper is the result of one year of sustained collaborative research. The results illustrate that uncertainties in the micro-scale can drastically impact the mechanical performance of metamaterials. Our formulation achieved novel microstructure designs by employing computational algorithms which follow the evolutionary principles of nature.”
    Co-author Professor Sondipon Adhikari explains:
    “This approach allowed us to achieve extreme mechanical properties involving negative Poisson’s ratio (auxetic metamaterial) and elastic modulus. The ability to manipulate extreme mechanical properties through novel optimal micro-architecture designs will open up new possibilities for manufacturing and applications.”
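The “evolutionary principles of nature” the authors invoke can be illustrated generically with a toy evolutionary loop that steers a two-parameter “design” toward a target property pair. Everything below — the objective, bounds, and names — is an illustrative stand-in, not the paper’s formulation:

```python
import random

random.seed(0)

def evolve(fitness, bounds, pop_size=30, generations=60, mut_scale=0.1):
    """Minimal evolutionary search: keep the fittest half of the
    population each generation and refill it with mutated copies."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)  # lower objective value = better design
        survivors = pop[: pop_size // 2]
        children = [
            [min(hi, max(lo, x + random.gauss(0, mut_scale * (hi - lo))))
             for x in random.choice(survivors)]
            for _ in range(pop_size - len(survivors))
        ]
        pop = survivors + children
    return min(pop, key=fitness)

# Toy stand-in for a micro-structure objective: squared distance from a
# target (stiffness, density) pair; a real objective would come from a
# homogenized mechanical model of the cell geometry, with uncertainty.
target = (0.3, 0.7)
objective = lambda p: (p[0] - target[0]) ** 2 + (p[1] - target[1]) ** 2
best = evolve(objective, bounds=(0.0, 1.0))
```

Because the best design always survives to the next generation, the objective never worsens; repeated mutation and selection then refine the design toward the target properties.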
    Story Source:
    Materials provided by Swansea University. Note: Content may be edited for style and length.

  • Progress in algorithms makes small, noisy quantum computers viable

    As reported in a new article in Nature Reviews Physics, instead of waiting for fully mature quantum computers to emerge, Los Alamos National Laboratory and other leading institutions have developed hybrid classical/quantum algorithms to extract the most performance — and potentially quantum advantage — from today’s noisy, error-prone hardware. Known as variational quantum algorithms, they use the quantum devices to manipulate quantum systems while shifting much of the workload to classical computers to let them do what they currently do best: solve optimization problems.
    “Quantum computers have the promise to outperform classical computers for certain tasks, but on currently available quantum hardware they can’t run long algorithms. They have too much noise as they interact with the environment, which corrupts the information being processed,” said Marco Cerezo, a physicist specializing in quantum computing, quantum machine learning, and quantum information at Los Alamos and a lead author of the paper. “With variational quantum algorithms, we get the best of both worlds. We can harness the power of quantum computers for tasks that classical computers can’t do easily, then use classical computers to complement the computational power of quantum devices.”
    Current noisy, intermediate-scale quantum computers have between 50 and 100 qubits, lose their “quantumness” quickly, and lack error correction, which would require more qubits. Since the late 1990s, however, theoreticians have been developing algorithms designed to run on an idealized large, error-corrected, fault-tolerant quantum computer.
    “We can’t implement these algorithms yet because they give nonsense results or they require too many qubits. So people realized we needed an approach that adapts to the constraints of the hardware we have — an optimization problem,” said Patrick Coles, a theoretical physicist developing algorithms at Los Alamos and the senior lead author of the paper.
    “We found we could turn all the problems of interest into optimization problems, potentially with quantum advantage, meaning the quantum computer beats a classical computer at the task,” Coles said. Those problems include simulations for material science and quantum chemistry, factoring numbers, big-data analysis, and virtually every application that has been proposed for quantum computers.
    The algorithms are called variational because the optimization process varies the algorithm on the fly, as a kind of machine learning. It changes parameters and logic gates to minimize a cost function, which is a mathematical expression that measures how well the algorithm has performed the task. The problem is solved when the cost function reaches its lowest possible value.
    In each iteration of a variational quantum algorithm, the quantum computer estimates the cost function, then passes that result back to the classical computer. The classical computer then adjusts the input parameters and sends them to the quantum computer, which runs the optimization again.
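That hybrid loop can be sketched in a few lines. Here the “quantum” step is only a simulated, shot-noisy measurement of cos(θ) for a one-parameter circuit — a stand-in for real hardware — and the classical step is plain finite-difference gradient descent; all names and settings are illustrative:

```python
import math
import random

random.seed(1)

def estimate_cost(theta, shots=200):
    """Stand-in for the quantum step: a one-parameter circuit whose
    cost is <Z> = cos(theta), estimated from noisy measurement 'shots'."""
    p0 = (1 + math.cos(theta)) / 2          # probability of measuring |0>
    hits = sum(random.random() < p0 for _ in range(shots))
    return 2 * hits / shots - 1             # shot-noise estimate of cos(theta)

# Classical outer loop: finite-difference gradient descent on the
# noisy cost estimates returned by the simulated 'quantum' device.
theta, lr, eps = 0.3, 0.4, 0.2
for _ in range(150):
    grad = (estimate_cost(theta + eps) - estimate_cost(theta - eps)) / (2 * eps)
    theta -= lr * grad                      # classical parameter update

final_cost = estimate_cost(theta, shots=5000)
```

Despite the noise in every cost estimate, the loop drives the parameter toward the minimum of the cost function (cos θ = -1), which is the behavior the variational approach relies on.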
    The review article is meant to be a comprehensive introduction and pedagogical reference for researchers starting out in this nascent field. In it, the authors discuss all the applications for these algorithms and how they work, and cover challenges, pitfalls, and how to address them. Finally, they look to the future, considering the best opportunities for achieving quantum advantage on the computers that will be available in the next couple of years.
    Story Source:
    Materials provided by DOE/Los Alamos National Laboratory. Note: Content may be edited for style and length.

  • Best of both worlds — Combining classical and quantum systems to meet supercomputing demands

    One of the most interesting phenomena in quantum mechanics is “quantum entanglement.” This phenomenon describes how certain particles are inextricably linked, such that their states can only be described with reference to each other. This particle interaction also forms the basis of quantum computing. And this is why, in recent years, physicists have looked for techniques to generate entanglement. However, these techniques confront a number of engineering hurdles, including limitations in creating large numbers of “qubits” (quantum bits, the basic units of quantum information), the need to maintain extremely low temperatures (1 K), and the use of ultrapure materials. Surfaces or interfaces are crucial in the formation of quantum entanglement. Unfortunately, electrons confined to surfaces are prone to “decoherence,” a condition in which there is no defined phase relationship between the two distinct states. Thus, to obtain stable, coherent qubits, the spin states of surface atoms (or equivalently, protons) must be determined.
    Recently, a team of scientists in Japan, including Prof. Takahiro Matsumoto from Nagoya City University, Prof. Hidehiko Sugimoto from Chuo University, Dr. Takashi Ohhara from the Japan Atomic Energy Agency, and Dr. Susumu Ikeda from High Energy Accelerator Research Organization, recognized the need for stable qubits. By looking at the surface spin states, the scientists discovered an entangled pair of protons on the surface of a silicon nanocrystal.
    Prof. Matsumoto, the lead scientist, outlines the significance of their study, “Proton entanglement has been previously observed in molecular hydrogen and plays an important role in a variety of scientific disciplines. However, the entangled state was found in gas or liquid phases only. Now, we have detected quantum entanglement on a solid surface, which can lay the groundwork for future quantum technologies.” Their pioneering study was published in a recent issue of Physical Review B.
    The scientists studied the spin states using a technique known as “inelastic neutron scattering spectroscopy” to determine the nature of surface vibrations. By modeling these surface atoms as “harmonic oscillators,” they showed that the protons’ states are antisymmetric under exchange. Since the protons were identical (or indistinguishable), the oscillator model restricted their possible spin states, resulting in strong entanglement. Compared to the proton entanglement in molecular hydrogen, this entanglement harbored a massive energy difference between its states, ensuring its longevity and stability. Additionally, the scientists theoretically demonstrated a cascade transition of terahertz entangled photon pairs using the proton entanglement.
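The link between exchange antisymmetry and entanglement can be written down at the textbook level (a sketch of the general principle, not the authors' full oscillator model): protons are spin-1/2 fermions, so the two-proton wavefunction must change sign under exchange, and a symmetric spatial ground state then forces the spin part into the antisymmetric singlet, which is a maximally entangled state.

```latex
% Two identical fermions: the total wavefunction is antisymmetric under exchange
\Psi(1,2) = \psi_{\mathrm{space}}(1,2)\,\chi_{\mathrm{spin}}(1,2) = -\Psi(2,1)
% A symmetric spatial ground state therefore forces the antisymmetric (singlet) spin state
\chi_{\mathrm{spin}} = \frac{1}{\sqrt{2}}\bigl(\lvert\uparrow\downarrow\rangle - \lvert\downarrow\uparrow\rangle\bigr)
```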
    The confluence of proton qubits with contemporary silicon technology could result in an organic union of classical and quantum computing platforms, enabling a much larger number of qubits (10⁶) than currently available (10²), and ultra-fast processing for new supercomputing applications. “Quantum computers can handle intricate problems, such as integer factorization and the ‘traveling salesman problem,’ which are virtually impossible to solve with traditional supercomputers. This could be a game-changer in quantum computing with regard to storing, processing, and transferring data, potentially even leading to a paradigm shift in pharmaceuticals, data security, and many other areas,” concludes an optimistic Prof. Matsumoto.
    We could be on the verge of witnessing a technological revolution in quantum computing!
    Story Source:
    Materials provided by Nagoya City University. Note: Content may be edited for style and length.

  • A mobility-based approach to optimize pandemic lockdown strategies

    A new strategy for modeling the spread of COVID-19 incorporates smartphone-captured data on people’s movements and shows promise for aiding development of optimal lockdown policies. Ritabrata Dutta of Warwick University, U.K., and colleagues present these findings in the open-access journal PLOS Computational Biology.
    Evidence shows that lockdowns are effective in mitigating the spread of COVID-19. However, they come at a high economic cost, and in practice not everybody follows government guidance on lockdowns. Thus, Dutta and colleagues propose, an optimal lockdown strategy would balance controlling the ongoing COVID-19 pandemic against minimizing the economic costs of lockdowns.
    To help guide such a strategy, the researchers developed new mathematical models that simulate the spread of COVID-19. The models focus on England and France and — using a statistical approach known as approximate Bayesian computation — they incorporate both public health data and data on changes in people’s movements, as captured by Google via Android devices; this mobility data serves as a measure of the effectiveness of lockdown policies.
    Then, the researchers demonstrated how their models could be applied to design optimal lockdown strategies for England and France using a mathematical technique called optimal control. They showed that it is possible to design effective lockdown protocols that allow partial reopening of workplaces and schools, while taking into account both public health costs and economic costs. The models can be updated in real time, and they can be adapted to any country for which reliable public health and Google mobility data are available.
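The fitting step can be illustrated with a minimal approximate-Bayesian-computation (ABC) rejection sketch on a toy SIR-type model. The model, prior, tolerance, and all names below are illustrative stand-ins, not the paper's England and France models:

```python
import random

random.seed(0)

def simulate_infections(beta, days=30, pop=1000, i0=5, gamma=0.1):
    """Toy discrete-time SIR model; returns the daily infectious counts."""
    s, i = pop - i0, i0
    series = []
    for _ in range(days):
        new = beta * s * i / pop        # new infections this day
        s, i = s - new, i + new - gamma * i
        series.append(i)
    return series

def abc_rejection(observed, n_draws=3000, tol=800.0):
    """ABC rejection sampling: draw the transmission rate from the
    prior, simulate, and keep draws whose output is close to the data."""
    accepted = []
    for _ in range(n_draws):
        beta = random.uniform(0.05, 0.6)            # uniform prior
        sim = simulate_infections(beta)
        dist = sum(abs(a - b) for a, b in zip(sim, observed))
        if dist < tol:
            accepted.append(beta)
    return accepted

observed = simulate_infections(beta=0.3)            # synthetic 'data'
posterior = abc_rejection(observed)
estimate = sum(posterior) / len(posterior)          # posterior mean
```

The accepted draws approximate the posterior over the transmission rate without ever evaluating a likelihood; in the paper's setting, mobility data enters the model and the simulator is far richer, but the inference principle is the same.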
    “Our work opens the door to a larger integration between epidemiological models and real-world data to, through the use of supercomputers, determine best public policies to mitigate the effects of a pandemic,” Dutta says. “In a not-so-distant future, policy makers may be able to express certain prioritization criteria, and a computational engine, with an extensive use of different datasets, could determine the best course of action.”
    Next, the researchers plan to refine their country-wide models to work at smaller scales, specifically for each of the 348 local district authorities of the U.K.
    The researchers add, “The integration of big data, epidemiological models and supercomputers can help us design an optimal lockdown strategy in real time, while balancing both public health and economic costs.”
    Story Source:
    Materials provided by PLOS. Note: Content may be edited for style and length.