More stories

  • Artificial visual system of record-low energy consumption for the next generation of AI

    A joint research project led by City University of Hong Kong (CityU) has built an ultralow-power artificial visual system that mimics the human brain and successfully performed data-intensive cognitive tasks. The experimental results could provide a promising device system for the next generation of artificial intelligence (AI) applications.
    The research team is led by Professor Johnny Chung-yin Ho, Associate Head and Professor of the Department of Materials Science and Engineering (MSE) at CityU. Their findings have been published in the scientific journal Science Advances, titled “Artificial visual system enabled by quasi-two-dimensional electron gases in oxide superlattice nanowires.”
    As advances in the semiconductor technologies used in digital computing show signs of stagnation, neuromorphic (brain-like) computing systems have been regarded as one of the leading alternatives for the future. Scientists have been trying to develop the next generation of advanced AI computers that are as lightweight, energy-efficient and adaptable as the human brain.
    “Unfortunately, effectively emulating the brain’s neuroplasticity — the ability to change its neural network connections or re-wire itself — in existing artificial synapses in an ultralow-power manner is still challenging,” said Professor Ho.
    Enhancing energy efficiency of artificial synapses
    An artificial synapse is an artificial version of a synapse — the junction across which two neurons pass electrical signals to communicate with each other in the brain. It is a device that mimics the brain’s efficient neural signal transmission and memory-formation process.

    To enhance the energy efficiency of artificial synapses, Professor Ho’s research team has introduced quasi-two-dimensional electron gases (quasi-2DEGs) into artificial neuromorphic systems for the first time. Using oxide superlattice nanowires — a kind of semiconductor with intriguing electrical properties that the team developed — they designed quasi-2DEG photonic synaptic devices with a record-low energy consumption of under a femtojoule (0.7 fJ) per synaptic event, a decrease of about 93% compared with synapses in the human brain.
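    As a rough sanity check on that figure (our own arithmetic, assuming the commonly cited value of roughly 10 fJ per biological synaptic event rather than a number taken from the paper): $0.7\ \mathrm{fJ}/10\ \mathrm{fJ} = 0.07$, so the artificial synapse uses about 7% of the energy of its biological counterpart, a reduction of roughly 93%.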
    “Our experiments have demonstrated that the artificial visual system based on our photonic synapses could simultaneously perform light detection, brain-like processing and memory functions in an ultralow-power manner. We believe our findings can provide a promising strategy to build artificial neuromorphic systems for applications in bionic devices, electronic eyes, and multifunctional robotics in the future,” said Professor Ho.
    Resembling conductance change in synapses
    He explained that a two-dimensional electron gas occurs when electrons are confined to a two-dimensional interface between two different materials. Since electron-electron and electron-ion interactions are essentially absent there, the electrons move freely along the interface.
    Upon exposure to light pulses, a series of reactions is induced between oxygen molecules from the environment adsorbed onto the nanowire surface and the free electrons from the two-dimensional electron gases inside the oxide superlattice nanowires, which changes the conductance of the photonic synapses. Given the outstanding charge-carrier mobility and light sensitivity of the superlattice nanowires, this change of conductance resembles that in a biological synapse, so the quasi-2DEG photonic synapses can mimic how the neurons in the human brain transmit and memorise signals.
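    To make the "detection plus memory" behaviour concrete, the toy model below treats the synaptic weight as a conductance that rises while light pulses arrive and decays slowly afterwards. It is a minimal illustrative sketch with assumed parameters and simple exponential dynamics, not the device physics reported in the paper.

    ```python
    # Toy photonic-synapse model: conductance (the "synaptic weight") rises while
    # light is on and decays slowly afterwards, combining detection and memory.
    # All parameters and the exponential dynamics are assumptions for illustration.
    import numpy as np

    dt = 1e-3                      # time step (s)
    t = np.arange(0, 10, dt)       # 10-second window
    light = t < 2.0                # a 2-second light pulse at the start

    g = np.zeros_like(t)           # conductance, arbitrary units
    g_gain, tau_decay = 5.0, 30.0  # potentiation rate and retention time constant (assumed)

    for i in range(1, len(t)):
        g[i] = g[i - 1] + g_gain * light[i] * dt - (g[i - 1] / tau_decay) * dt

    print(f"conductance right after the pulse: {g[t.searchsorted(2.0)]:.3f}")
    print(f"conductance 8 s later (memory retained): {g[-1]:.3f}")
    ```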

    advertisement

    A combo of photo-detection and memory functions
    “The special properties of the superlattice nanowire materials enable our synapses to have both the photo-detecting and memory functions simultaneously. Put simply, the nanowire superlattice cores detect the light stimulus with high sensitivity, and the nanowire shells provide the memory functions. So there is no need to construct additional memory modules for charge storage in an image-sensing chip. As a result, our device can save energy,” explained Professor Ho.
    With this quasi-2DEG photonic synapse, the team has built an artificial visual system that could accurately and efficiently detect a patterned light stimulus and “memorise” its shape for an hour. “It is just like how our brain remembers what we saw for some time,” described Professor Ho.
    He added that the way the team synthesised the photonic synapses and the artificial visual system did not require complex equipment, and that the devices could be made on flexible plastics in a scalable, low-cost manner.
    Professor Ho is the corresponding author of the paper. The co-first authors are Meng You and Li Fangzhou, PhD students from MSE at CityU. Other team members include Dr Bu Xiuming, Dr Yip Sen-po, Kang Xiaolin, Wei Renjie, Li Dapan and Wang Fei, who are all from CityU. Other collaborating researchers come from University of Electronic Science and Technology of China, Kyushu University, and University of Tokyo.
    The study received funding support from CityU, the Research Grants Council of Hong Kong SAR, the National Natural Science Foundation of China and the Science, Technology and Innovation Commission of Shenzhen Municipality.

  • Artificial Chemist 2.0: quantum dot R&D in less than an hour

    A new technology, called Artificial Chemist 2.0, allows users to go from requesting a custom quantum dot to completing the relevant R&D and beginning manufacturing in less than an hour. The tech is completely autonomous, and uses artificial intelligence (AI) and automated robotic systems to perform multi-step chemical synthesis and analysis.
    Quantum dots are colloidal semiconductor nanocrystals, which are used in applications such as LED displays and solar cells.
    “When we rolled out the first version of Artificial Chemist, it was a proof of concept,” says Milad Abolhasani, corresponding author of a paper on the work and an assistant professor of chemical and biomolecular engineering at North Carolina State University. “Artificial Chemist 2.0 is industrially relevant for both R&D and manufacturing.”
    From a user standpoint, the whole process essentially consists of three steps. First, a user tells Artificial Chemist 2.0 the parameters for the desired quantum dots. For example, what color light do you want to produce? The second step is effectively the R&D stage, where Artificial Chemist 2.0 autonomously conducts a series of rapid experiments, allowing it to identify the optimum material and the most efficient means of producing that material. Third, the system switches over to manufacturing the desired amount of the material.
    “Quantum dots can be divided up into different classes,” Abolhasani says. “For example, well-studied II-VI, IV-VI, and III-V materials, or the recently emerging metal halide perovskites, and so on. Basically, each class consists of a range of materials that have similar chemistries.
    “And the first time you set up Artificial Chemist 2.0 to produce quantum dots in any given class, the robot autonomously runs a set of active learning experiments. This is how the brain of the robotic system learns the materials chemistry,” Abolhasani says. “Depending on the class of material, this learning stage can take between one and 10 hours. After that one-time active learning period, Artificial Chemist 2.0 can identify the best possible formulation for producing the desired quantum dots from 20 million possible combinations with multiple manufacturing steps in 40 minutes or less.”
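    As a concrete picture of such an active-learning loop (propose synthesis conditions, run the experiment, update a surrogate model, repeat), the sketch below uses a simple nearest-neighbour surrogate over randomly drawn candidate conditions. It is a generic illustration only: the parameter names, the placeholder run_synthesis() function and the target emission wavelength are assumptions, not part of the Artificial Chemist 2.0 software.

    ```python
    # Generic active-learning loop for tuning synthesis conditions toward a target
    # emission wavelength. Illustrative sketch only; not the Artificial Chemist code.
    import numpy as np

    rng = np.random.default_rng(0)

    def run_synthesis(params):
        """Placeholder for the robotic flow reactor: returns a measured emission
        wavelength (nm) for a given (precursor ratio, temperature) setting."""
        ratio, temp = params
        return 450 + 120 * ratio + 0.2 * (temp - 100) + rng.normal(0, 2)

    target_nm = 520.0                        # user-requested emission colour (assumed)
    history_x, history_y = [], []

    for step in range(20):                   # 20 automated experiments
        candidates = rng.uniform([0.0, 60.0], [1.0, 160.0], size=(500, 2))
        if len(history_x) < 5:               # initial random exploration
            next_x = candidates[0]
        else:                                # nearest-neighbour surrogate prediction
            X, y = np.array(history_x), np.array(history_y)
            dists = np.linalg.norm(candidates[:, None, :] - X[None, :, :], axis=2)
            pred = y[dists.argmin(axis=1)]
            next_x = candidates[np.abs(pred - target_nm).argmin()]
        history_x.append(next_x)
        history_y.append(run_synthesis(next_x))

    best = int(np.abs(np.array(history_y) - target_nm).argmin())
    print("best conditions found:", history_x[best], "->", round(history_y[best], 1), "nm")
    ```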
    The researchers note that the R&D process will almost certainly become faster every time people use it, since the AI algorithm that runs the system will learn more — and become more efficient — with every material that it is asked to identify.
    Artificial Chemist 2.0 incorporates two chemical reactors that operate in series. The system is designed to be entirely autonomous, and allows users to switch from one material to another without having to shut down the system. Video of how the system works can be found at https://youtu.be/e_DyV-hohLw.
    “In order to do this successfully, we had to engineer a system that leaves no chemical residues in the reactors and allows the AI-guided robotic system to add the right ingredients, at the right time, at any point in the multi-step material production process,” Abolhasani says. “So that’s what we did.
    “We’re excited about what this means for the specialty chemicals industry. It really accelerates R&D to warp speed, but it is also capable of making kilograms per day of high-value, precisely engineered quantum dots. Those are industrially relevant volumes of material.”

    Story Source:
    Materials provided by North Carolina State University. Note: Content may be edited for style and length.

  • Atom-thin transistor uses half the voltage of common semiconductors, boosts current density

    University at Buffalo researchers are reporting a new, two-dimensional transistor made of graphene and the compound molybdenum disulfide that could help usher in a new era of computing.
    As described in a paper accepted at the 2020 IEEE International Electron Devices Meeting, which is taking place virtually next week, the transistor requires half the voltage of current semiconductors. It also has a current density greater than similar transistors under development.
    This ability to operate with less voltage and handle more current is key to meeting the demand for new power-hungry nanoelectronic devices, including quantum computers.
    “New technologies are needed to extend the performance of electronic systems in terms of power, speed, and density. This next-generation transistor can rapidly switch while consuming low amounts of energy,” says the paper’s lead author, Huamin Li, Ph.D., assistant professor of electrical engineering in the UB School of Engineering and Applied Sciences (SEAS).
    The transistor is composed of a single layer of graphene and a single layer of molybdenum disulfide, or MoS2, which is part of a group of compounds known as transition metal dichalcogenides. The graphene and MoS2 are stacked together, and the overall thickness of the device is roughly 1 nanometer — for comparison, a sheet of paper is about 100,000 nanometers.
    While most transistors require at least 60 millivolts of gate voltage for a tenfold (one decade) change in current at room temperature, this new device needs only 29 millivolts per decade.
    It’s able to do this because the unique physical properties of graphene keep electrons “cold” as they are injected from the graphene into the MoS2 channel. This process is called Dirac-source injection. The electrons are considered “cold” because they require much less voltage input and, thus, reduced power consumption to operate the transistor.
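    For reference, the 60-millivolt figure is the room-temperature thermionic (Boltzmann) limit on the subthreshold swing of a conventional transistor. The worked value below is standard device physics rather than anything taken from this paper:

    $\mathrm{SS}_{\min} = \ln(10)\,\frac{k_B T}{q} \approx 2.303 \times 25.9\ \mathrm{mV} \approx 60\ \mathrm{mV/decade}$ at $T = 300\ \mathrm{K}$.

    Because Dirac-source injection narrows the energy distribution of the electrons entering the channel, the device can operate below this limit, here at 29 mV per decade of current change.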
    An even more important characteristic of the transistor, Li says, is its ability to handle a greater current density compared to conventional transistor technologies based on 2D or 3D channel materials. As described in the study, the transistor can handle 4 microamps per micrometer.
    “The transistor illustrates the enormous potential of 2D semiconductors and their ability to usher in energy-efficient nanoelectronic devices. This could ultimately lead to advancements in quantum research and development, and help extend Moore’s Law,” says co-lead author Fei Yao, PhD, assistant professor in the Department of Materials Design and Innovation, a joint program of SEAS and UB’s College of Arts and Sciences.
    The work was supported by the U.S. National Science Foundation, the New York State Energy Research and Development Authority, the New York State Center of Excellence in Materials Informatics at UB, and the Vice President for Research and Economic Development at UB.

    Story Source:
    Materials provided by University at Buffalo. Original written by Cory Nealon. Note: Content may be edited for style and length.

  • Faster and more efficient information transfer

    Whether in smartphones, laptops, or mainframes: the transmission, processing, and storage of information currently relies on a single class of material — just as it did in the early days of computer science some 60 years ago. A new class of magnetic materials, however, could raise information technology to a new level. Antiferromagnetic insulators enable computing speeds that are a thousand times faster than conventional electronics, with significantly less heating. Components could be packed closer together and logic modules could thus become smaller, which has so far been limited by the heat generated by current components.
    Information transfer at room temperature
    So far, the problem has been that information transfer in antiferromagnetic insulators only worked at low temperatures. But who wants to put their smartphone in the freezer just to be able to use it? Physicists at Johannes Gutenberg University Mainz (JGU) have now been able to eliminate this shortcoming, together with experimentalists from the CNRS/Thales lab, the CEA Grenoble, and the National High Field Laboratory in France as well as theorists from the Center for Quantum Spintronics (QuSpin) at the Norwegian University of Science and Technology. “We were able to transmit and process information in a standard antiferromagnetic insulator at room temperature — and to do so over long enough distances to enable information processing to occur,” said JGU scientist Andrew Ross. The researchers used iron oxide (α-Fe2O3), the main component of rust, as an antiferromagnetic insulator, because iron oxide is widespread and easy to manufacture.
    The transfer of information in magnetic insulators is made possible by excitations of magnetic order known as magnons. These move as waves through magnetic materials, similar to how waves move across the water surface of a pond after a stone has been thrown into it. Previously, it was believed that these waves must have circular polarization in order to efficiently transmit information. In iron oxide, such circular polarization occurs only at low temperatures. However, the international research team was able to transmit magnons over exceptionally long distances even at room temperature. But how did that work? “We realized that in antiferromagnets with a single plane, two magnons with linear polarization can overlap and migrate together. They complement each other to form an approximately circular polarization,” explained Dr. Romain Lebrun, researcher at the joint CNRS/Thales laboratory in Paris who previously worked in Mainz. “The possibility of using iron oxide at room temperature makes it an ideal playground for the development of ultra-fast spintronic devices based on antiferromagnetic insulators.”
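    The picture in the quote parallels a standard identity from wave physics (added here for illustration, not taken from the paper): two linearly polarized waves of equal amplitude along orthogonal directions, a quarter period out of phase, add up to a circularly polarized wave,

    $\hat{x}\cos(\omega t) + \hat{y}\cos(\omega t \pm \tfrac{\pi}{2}) = \hat{x}\cos(\omega t) \mp \hat{y}\sin(\omega t),$

    which traces out a circle over time. In the same spirit, two linearly polarized magnons travelling together can carry the information that a single circularly polarized magnon would otherwise carry.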
    Extremely low attenuation allows for energy-efficient transmission
    An important question in the process of information transfer is how quickly the information is lost when moving through magnetic materials. This can be quantified by the value of the magnetic damping. “The iron oxide examined has one of the lowest magnetic attenuations ever reported in magnetic materials,” explained Professor Mathias Kläui from the JGU Institute of Physics. “We anticipate that high magnetic field techniques will show that other antiferromagnetic materials have similarly low attenuation, which is crucial for the development of a new generation of spintronic devices. We are pursuing such low-power magnetic technologies in a long-term collaboration with our colleagues at QuSpin in Norway, and I am happy to see that another piece of exciting work has come out of this collaboration.”

    Story Source:
    Materials provided by Johannes Gutenberg Universitaet Mainz. Note: Content may be edited for style and length.

  • A better kind of cybersecurity strategy

    During the opening ceremonies of the 2018 Winter Olympics, held in PyeongChang, South Korea, Russian hackers launched a cyberattack that disrupted television and internet systems at the games. The incident was resolved quickly, but because Russia used North Korean IP addresses for the attack, the source of the disruption was unclear in the event’s immediate aftermath.
    There is a lesson in that attack, and others like it, at a time when hostilities between countries increasingly occur online. In contrast to conventional national security thinking, such skirmishes call for a new strategic outlook, according to a new paper co-authored by an MIT professor.
    The core of the matter involves deterrence and retaliation. In conventional warfare, deterrence usually consists of potential retaliatory military strikes against enemies. But in cybersecurity, this is more complicated. If identifying cyberattackers is difficult, then retaliating too quickly or too often, on the basis of limited information such as the location of certain IP addresses, can be counterproductive. Indeed, it can embolden other countries to launch their own attacks, by leading them to think they will not be blamed.
    “If one country becomes more aggressive, then the equilibrium response is that all countries are going to end up becoming more aggressive,” says Alexander Wolitzky, an MIT economist who specializes in game theory. “If after every cyberattack my first instinct is to retaliate against Russia and China, this gives North Korea and Iran impunity to engage in cyberattacks.”
    But Wolitzky and his colleagues do think there is a viable new approach, involving a more judicious and well-informed use of selective retaliation.
    “Imperfect attribution makes deterrence multilateral,” Wolitzky says. “You have to think about everybody’s incentives together. Focusing your attention on the most likely culprits could be a big mistake.”
    The paper, “Deterrence with Imperfect Attribution,” appears in the latest issue of the American Political Science Review. In addition to Wolitzky, the authors are Sandeep Baliga, the John L. and Helen Kellogg Professor of Managerial Economics and Decision Sciences at Northwestern University’s Kellogg School of Management; and Ethan Bueno de Mesquita, the Sydney Stein Professor and deputy dean of the Harris School of Public Policy at the University of Chicago.

    The study is a joint project that began when Baliga contacted Wolitzky, whose own work applies game theory to a wide variety of situations, including war, international affairs, network behavior, labor relations, and even technology adoption.
    “In some sense this is a canonical kind of question for game theorists to think about,” Wolitzky says, noting that the development of game theory as an intellectual field stems from the study of nuclear deterrence during the Cold War. “We were interested in what’s different about cyberdeterrence, in contrast to conventional or nuclear deterrence. And of course there are a lot of differences, but one thing that we settled on pretty early is this attribution problem.” In their paper, the authors note that, as former U.S. Deputy Secretary of Defense William Lynn once put it, “Whereas a missile comes with a return address, a computer virus generally does not.”
    In some cases, countries are not even aware of major cyberattacks against them; Iran only belatedly realized it had been attacked by the Stuxnet worm over a period of years, damaging centrifuges being used in the country’s nuclear program.
    In the paper, the scholars largely examined scenarios where countries are aware of cyberattacks against them but have imperfect information about the attacks and attackers. After modeling these events extensively, the researchers determined that the multilateral nature of cybersecurity today makes it markedly different than conventional security. There is a much higher chance in multilateral conditions that retaliation can backfire, generating additional attacks from multiple sources.
    “You don’t necessarily want to commit to be more aggressive after every signal,” Wolitzky says.

    What does work, however, is simultaneously improving detection of attacks and gathering more information about the identity of the attackers, so that a country can pinpoint which other nations it could meaningfully retaliate against.
    But even gathering more information to inform strategic decisions is a tricky process, as the scholars show. Detecting more attacks while being unable to identify the attackers does not clarify specific decisions, for instance. And gathering more information but having “too much certainty in attribution” can lead a country straight back into the problem of lashing out against some states, even as others are continuing to plan and commit attacks.
    “The optimal doctrine in this case in some sense will commit you to retaliate more after the clearest signals, the most unambiguous signals,” Wolitzky says. “If you blindly commit yourself more to retaliate after every attack, you increase the risk you’re going to be retaliating after false alarms.”
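    The intuition can be made concrete with a toy simulation (our own illustration, not the authors' model): under imperfect attribution, retaliating after every detected attack frequently punishes the wrong state, while retaliating only on high-confidence signals trades a few missed responses for far fewer misdirected ones. All numbers below are assumptions.

    ```python
    # Toy illustration of deterrence under imperfect attribution: compare
    # "retaliate after every signal" with "retaliate only on high-confidence
    # signals". Not the model from the paper; all parameters are assumed.
    import random

    random.seed(1)
    states = ["A", "B", "C"]
    p_correct = 0.6        # probability the attribution points at the true attacker

    def simulate(confidence_threshold, n_attacks=10_000):
        right = wrong = ignored = 0
        for _ in range(n_attacks):
            true_attacker = random.choice(states)
            if random.random() < p_correct:      # attribution happens to be correct
                blamed, confidence = true_attacker, random.uniform(0.6, 1.0)
            else:                                # attribution points at an innocent state
                blamed = random.choice([s for s in states if s != true_attacker])
                confidence = random.uniform(0.3, 0.7)
            if confidence >= confidence_threshold:
                right += blamed == true_attacker
                wrong += blamed != true_attacker
            else:
                ignored += 1
        return right, wrong, ignored

    for threshold in (0.0, 0.7):
        r, w, i = simulate(threshold)
        print(f"threshold {threshold}: correct retaliations {r}, misdirected {w}, no response {i}")
    ```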
    Wolitzky points out that the paper’s model can apply to issues beyond cybersecurity. The problem of stopping pollution can have the same dynamics. If, for instance, numerous firms are polluting a river, singling just one out for punishment can embolden the others to continue.
    Still, the authors do hope the paper will generate discussion in the foreign-policy community, with cyberattacks continuing to be a significant source of national security concern.
    “People thought the possibility of failing to detect or attribute a cyberattack mattered, but there hadn’t [necessarily] been a recognition of the multilateral implications of this,” Wolitzky says. “I do think there is interest in thinking about the applications of that.”

  • Significant step toward quantum advantage

    The team, led by Bristol researcher and Phasecraft co-founder Dr. Ashley Montanaro, has developed algorithms and analysis that significantly reduce the quantum hardware capability needed to solve problems that lie beyond the reach of classical computing, even supercomputers.
    In the paper, published in Physical Review B, the team demonstrates how optimised quantum algorithms can solve the notorious Fermi-Hubbard model on near-term hardware.
    The Fermi-Hubbard model is of fundamental importance in condensed-matter physics as a model for strongly correlated materials and a route to understanding high-temperature superconductivity.
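    For readers unfamiliar with it, the Fermi-Hubbard model describes electrons hopping between neighbouring lattice sites while paying an energy cost for doubly occupying a site; in standard notation (added here as background, not quoted from the paper),

    $H = -t \sum_{\langle i,j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right) + U \sum_{i} n_{i\uparrow} n_{i\downarrow},$

    where $t$ is the hopping amplitude, $U$ the on-site repulsion, and $n_{i\sigma} = c^{\dagger}_{i\sigma} c_{i\sigma}$. Despite this compact form, finding the model's ground state quickly becomes intractable for classical computers as the lattice grows.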
    Finding the ground state of the Fermi-Hubbard model has been predicted to be one of the first applications of near-term quantum computers, and one that offers a pathway to understanding and developing novel materials.
    Dr. Ashley Montanaro, research lead and cofounder of Phasecraft: “Quantum computing has critically important applications in materials science and other domains. Despite the major quantum hardware advances recently, we may still be several years from having the right software and hardware to solve meaningful problems with quantum computing. Our research focuses on algorithms and software optimisations to maximise the quantum hardware’s capacity, and bring quantum computing closer to reality.
    “Near-term quantum hardware will have limited device and computation size. Phasecraft applied new theoretical ideas and numerical experiments to put together a very comprehensive study on different strategies for solving the Fermi-Hubbard model, zeroing in on strategies that are most likely to have the best results and impact in the near future.

    “The results suggest that optimising over quantum circuits with a gate depth substantially less than a thousand could be sufficient to solve instances of the Fermi-Hubbard model beyond the capacity of a supercomputer. This new research shows significant promise for the capabilities of near-term quantum devices, improving on previous research findings by around a factor of 10.”
    Physical Review B, published by the American Physical Society, is the top specialist journal in condensed-matter physics. The peer-reviewed research paper was also chosen as the Editors’ Suggestion and to appear in Physics magazine.
    Andrew Childs, Professor in the Department of Computer Science and Institute for Advanced Computer Studies at the University of Maryland: “The Fermi-Hubbard model is a major challenge in condensed-matter physics, and the Phasecraft team has made impressive steps in showing how quantum computers could solve it. Their work suggests that surprisingly low-depth circuits could provide useful information about this model, making it more accessible to realistic quantum hardware.”
    Hartmut Neven, Head of Quantum Artificial Intelligence Lab, Google: “Sooner or later, quantum computing is coming. Developing the algorithms and technology to power the first commercial applications of early quantum computing hardware is the toughest challenge facing the field, which few are willing to take on. We are proud to be partners with Phasecraft, a team that are developing advances in quantum software that could shorten that timeframe by years.”
    Phasecraft Founder Dr. Toby Cubitt: “At Phasecraft, our team of leading quantum theorists have been researching and applying quantum theory for decades, leading some of the top global academic teams and research in the field. Today, Ashley and his team have demonstrated ways to get closer to achieving new possibilities that exist just beyond today’s technological bounds.”
    Phasecraft has closed a record seed round for a quantum company in the UK with £3.7m in funding from private-sector VC investors, led by LocalGlobe with Episode1 along with previous investors. Former Songkick founder Ian Hogarth has also joined as board chair for Phasecraft. Phasecraft previously raised a £750,000 pre-seed round led by UCL Technology Fund with Parkwalk Advisors and London Co-investment Fund and has earned several grants facilitated by InnovateUK. Between equity funding and research grants, Phasecraft has raised more than £5.5m.
    Dr Toby Cubitt: “With new funding and support, we are able to continue our pioneering research and industry collaborations to develop the quantum computing industry and find useful applications faster.”

    Story Source:
    Materials provided by University of Bristol. Note: Content may be edited for style and length.

  • Energy-efficient magnetic RAM: A new building block for spintronic technologies

    Researchers at Pohang University of Science and Technology (POSTECH) and Seoul National University in South Korea have demonstrated a new way to enhance the energy efficiency of a non-volatile magnetic memory device called SOT-MRAM. Published in Advanced Materials, this finding opens up a new window of exciting opportunities for future energy-efficient magnetic memories based on spintronics.
    In modern computers, random access memory (RAM) is used to store information. The SOT-MRAM (spin-orbit torque magnetic RAM) is one of the leading candidates for next-generation memory technologies that aim to surpass the performance of existing RAMs. The SOT-MRAM may operate faster than the fastest existing RAM (SRAM) and, unlike all of today’s fast RAMs, it retains information even after the power supply is switched off. The present level of SOT-MRAM technology falls short of being satisfactory, however, due to its high energy demand: it requires a large energy supply (a large current) to write information. Lowering the energy demand and enhancing the energy efficiency is an outstanding problem for the SOT-MRAM.
    In the SOT-MRAM, the magnetization directions of tiny magnets store information, and writing amounts to changing those magnetization directions to the desired ones. The change is achieved through a special physical phenomenon called SOT, which modifies the magnetization direction when a current is applied. To enhance the energy efficiency, soft magnets are the ideal material choice for the tiny magnets, since their magnetization directions can be easily altered by a small current. But soft magnets are a poor choice for the safe storage of information, since their magnetization direction may also be altered unintentionally, by thermal or other noise. For this reason, most attempts to build the SOT-MRAM adopt hard magnets, because their magnetization is robust and not easily altered by noise. But this material choice inevitably makes the energy efficiency of the SOT-MRAM poor.
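    The trade-off described above is commonly summarised by the thermal-stability factor of a single-domain magnet (standard spintronics background, not a result of this paper): $\Delta = K_{\mathrm{eff}} V / (k_B T)$, where $K_{\mathrm{eff}}$ is the effective magnetic anisotropy, $V$ the magnet volume and $T$ the temperature, with an expected retention time that grows roughly as $\tau \approx \tau_0\, e^{\Delta}$ (attempt time $\tau_0$ on the order of a nanosecond). A hard magnet has a large $K_{\mathrm{eff}}$ and hence long retention, but the current needed to switch it also grows with the anisotropy, so writing is costly; a soft magnet reverses both statements. Making the magnet hard at rest and soft only during writing, as the POSTECH and Seoul National University team now reports, sidesteps this trade-off.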
    A joint research team led by Professor Hyun-Woo Lee in the Department of Physics at POSTECH and Professor Je-Geun Park in the Department of Physics at Seoul National University (former associate director of the Center for Correlated Electron Systems within the Institute for Basic Science in Korea) demonstrated a way to enhance the energy efficiency without sacrificing safe storage. They reported that ultrathin iron germanium telluride (Fe3GeTe2, FGT) — a ferromagnetic material with special geometrical symmetry and quantum properties — switches from a hard magnet to a soft magnet when a small current is applied. Thus, when writing is not intended, the material remains a hard magnet, which is good for safe storage; only when writing is intended does it switch to a soft magnet, allowing for enhanced energy efficiency.
    “Intriguing properties of layered materials never cease to amaze me: the current through FGT induces a highly unusual type of spin-orbit torque (SOT), which modifies the energy profile of this material to switch it from a hard magnet to a soft magnet. This is in clear contrast to SOT produced by other materials, which may change the magnetization direction but cannot switch a hard magnet to a soft magnet,” explains Professor Lee.
    Experiments by Professor Park’s group revealed that this FGT-based magnetic memory device is highly energy-efficient. In particular, the measured magnitude of SOT per applied current density is two orders of magnitude larger than the values reported previously for other candidate materials for the SOT-MRAM.
    “Controlling magnetic states with a small current is essential for the next generation of energy-efficient devices. These will be able to store greater amounts of data and enable faster data access than today’s electronic memories, while consuming less energy,” notes Dr. Kaixuan Zhang, a team leader in Professor Park’s group who studies the application of correlated quantum physics in spintronic devices.
    “Our findings open up a fascinating avenue of electrical modulation and spintronic applications using 2D layered magnetic materials,” concluded Professor Lee.

    Story Source:
    Materials provided by Pohang University of Science & Technology (POSTECH). Note: Content may be edited for style and length.

  • 'Electronic amoeba' finds approximate solution to traveling salesman problem in linear time

    Researchers at Hokkaido University and Amoeba Energy in Japan have, inspired by the efficient foraging behavior of a single-celled amoeba, developed an analog computer for finding a reliable and swift solution to the traveling salesman problem — a representative combinatorial optimization problem.
    Many real-world tasks, such as planning and scheduling in logistics and automation, are mathematically formulated as combinatorial optimization problems. Conventional digital computers, including supercomputers, are inadequate for solving these complex problems in a practically permissible time, because the number of candidate solutions they need to evaluate increases exponentially with the problem size — also known as combinatorial explosion. New computers called “Ising machines,” including “quantum annealers,” have therefore been actively developed in recent years. These machines, however, require complicated pre-processing to convert each task into a form they can handle, and they carry a risk of presenting illegal solutions that do not meet some constraints and requests, which are major obstacles to practical applications.
    These obstacles can be avoided using the newly developed “electronic amoeba,” an analog computer inspired by a single-celled amoeboid organism. The amoeba is known to maximize nutrient acquisition efficiently by deforming its body. It has been shown to find an approximate solution to the traveling salesman problem (TSP): given a map of a certain number of cities, the problem is to find the shortest route that visits each city exactly once and returns to the starting city. This finding inspired Professor Seiya Kasai at Hokkaido University to mimic the dynamics of the amoeba electronically using an analog circuit, as described in the journal Scientific Reports. “The amoeba core searches for a solution under the electronic environment where resistance values at intersections of crossbars represent constraints and requests of the TSP,” says Kasai. Using the crossbars, the city layout can be easily altered by updating the resistance values without complicated pre-processing.
    Kenta Saito, a PhD student in Kasai’s lab, fabricated the circuit on a breadboard and succeeded in finding the shortest route for a 4-city TSP. He evaluated the performance for larger problems using a circuit simulator. The circuit reliably found a high-quality legal solution with a significantly shorter route length than the average length obtained by random sampling. Moreover, the time required to find a high-quality legal solution grew only linearly with the number of cities. Compared with the search time of a representative TSP heuristic, “2-opt,” the electronic amoeba becomes more advantageous as the number of cities increases. “The analog circuit reproduces well the unique and efficient optimization capability of the amoeba, which the organism has acquired through natural selection,” says Kasai.
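    For context, 2-opt, the classical baseline mentioned above, repeatedly reverses segments of a candidate tour whenever doing so shortens it. The sketch below, on random cities, is our own illustration of that baseline, not the benchmark code used in the study.

    ```python
    # Minimal 2-opt local search for the travelling salesman problem, shown only
    # to illustrate the classical baseline referenced above (random cities;
    # not the benchmark implementation from the study).
    import math
    import random

    random.seed(0)
    cities = [(random.random(), random.random()) for _ in range(20)]

    def tour_length(tour):
        return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    tour = list(range(len(cities)))
    random.shuffle(tour)

    improved = True
    while improved:                            # repeat until no reversal helps
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                candidate = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if tour_length(candidate) < tour_length(tour):
                    tour, improved = candidate, True

    print("2-opt tour length:", round(tour_length(tour), 3))
    ```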
    “As the analog computer consists of a simple and compact circuit, it can tackle many real-world problems in which inputs, constraints, and requests dynamically change and can be embedded into IoT devices as a power-saving microchip,” says Masashi Aono who leads Amoeba Energy to promote the practical use of the amoeba-inspired computers.
    This is a Joint Release between Hokkaido University and Amoeba Energy Co., Ltd.

    Story Source:
    Materials provided by Hokkaido University. Note: Content may be edited for style and length.