More stories

  • Multi-spin flips and a pathway to efficient Ising machines

    Combinatorial optimization problems are at the root of many industrial processes and solving them is key to a more sustainable and efficient future. Ising machines can solve certain combinatorial optimization problems, but their efficiency could be improved with multi-spin flips. Researchers have now tackled this difficult problem by developing a merge algorithm that disguises a multi-spin flip as a simpler, single-spin flip. This technology provides optimal solutions to hard computational problems in a shorter time.
    In a rapidly developing world, industries are always trying to optimize their operations and resources. Combinatorial optimization using an Ising machine helps solve certain operational problems, like mapping the most efficient route for a multi-city tour or optimizing delivery of resources. Ising machines operate by mapping the solution space to a spin configuration space and solving the associated spin problem instead. These machines have a wide range of applications in both academia and industry, tackling problems in machine learning, material design, portfolio optimization, logistics, and drug discovery. For larger problems, however, it is still difficult to obtain the optimal solution in a feasible amount of time.
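    To make the spin picture concrete, here is a minimal, illustrative sketch of the inner loop of a conventional single-spin-flip Ising machine: the problem is encoded as an Ising energy function, and candidate solutions are improved by flipping one spin at a time under a simulated-annealing schedule. The couplings, fields and schedule below are hypothetical toy values, not the researchers' setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy Ising instance: couplings J and fields h encode the optimization problem.
    n = 20
    J = rng.normal(size=(n, n))
    J = np.triu(J, 1)  # keep upper-triangular couplings, zero diagonal
    J = J + J.T        # symmetrize so each pair is counted once in the energy below
    h = rng.normal(size=n)
    s = rng.choice([-1, 1], size=n)  # random initial spin configuration

    def energy(s):
        # H(s) = -1/2 * s^T J s - h . s  (the Ising Hamiltonian)
        return -0.5 * s @ J @ s - h @ s

    # Single-spin-flip simulated annealing: the basic move of a conventional Ising machine.
    for beta in np.linspace(0.1, 3.0, 2000):  # inverse-temperature schedule
        i = rng.integers(n)
        delta = 2 * s[i] * (J[i] @ s + h[i])  # energy change from flipping spin i
        if delta <= 0 or rng.random() < np.exp(-beta * delta):
            s[i] = -s[i]  # accept the flip

    print("final energy:", energy(s))
    ```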
    Now, while Ising machines can be optimized by integrating multi-spin flips into their hardware, this is a challenging task because it essentially means completely overhauling the hardware of traditional Ising machines by changing their basic operation. But a team of researchers from the Department of Computer Science and Communications Engineering, Waseda University — consisting of Assistant Professor Tatsuhiko Shirai and Professor Nozomu Togawa — has provided a novel solution to this long-standing problem.
    In their paper, which was published in IEEE Transactions on Computers on 27 May 2022, they engineered a feasible multi-spin flip algorithm by deforming the Hamiltonian (the energy function of the Ising model). “We have developed a hybrid algorithm that takes an infeasible multi-spin flip and expresses it in the form of a feasible single-spin flip instead. This algorithm is proposed along with our merge process, in which the original Hamiltonian of a difficult combinatorial problem is deformed into a new Hamiltonian, a problem that the hardware of a traditional Ising machine can easily solve,” explains Tatsuhiko Shirai.
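    The summary does not spell out the paper's exact deformation, so the following is only a schematic guess at the flavor of such a merge process: if a group of spins is constrained to move together, they can be merged into one effective spin whose couplings and field are the sums of the originals, and flipping that single merged spin in the deformed Hamiltonian flips the whole group in the original one. The function below (hypothetical, using the same toy `J`, `h` convention as the previous sketch) merges one pair of spins.

    ```python
    import numpy as np

    def merge_spins(J, h, i, j):
        """Deform (J, h) by merging spin j into spin i under the constraint s_j == s_i.

        Flipping the single merged spin i in the deformed Hamiltonian is then
        equivalent (up to a constant energy offset) to flipping both original
        spins at once, so a two-spin flip becomes a feasible single-spin flip.
        Illustrative only; the paper's actual merge process may differ.
        """
        J, h = J.copy(), h.copy()
        J[i, :] += J[j, :]   # spin i absorbs j's couplings to every other spin...
        J[:, i] += J[:, j]
        J[i, i] = 0.0        # ...dropping the resulting self-coupling (a constant term)
        h[i] += h[j]         # and absorbs j's local field
        keep = [q for q in range(len(h)) if q != j]
        return J[np.ix_(keep, keep)], h[keep]

    # Example: merging spins 0 and 1 of a random 4-spin instance.
    rng = np.random.default_rng(1)
    J = rng.normal(size=(4, 4)); J = np.triu(J, 1); J = J + J.T
    h = rng.normal(size=4)
    Jm, hm = merge_spins(J, h, 0, 1)  # a 3-spin deformed problem
    ```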
    The newly developed hybrid Ising processes are fully compatible with current methods and hardware, reducing the barriers to their widespread adoption. “We applied the hybrid merge process to several common examples of difficult combinatorial optimization problems. Our algorithm shows superior performance in all instances. It reduces residual energy and reaches better solutions in a shorter time — it really is a win-win,” states Nozomu Togawa.
    Their work will allow industries to solve new, complex optimization problems and help tackle climate-related challenges such as rising energy demand and food shortages, supporting the realization of the Sustainable Development Goals (SDGs). “For example, we could use this to optimize shipping and delivery planning problems in industries to increase their efficiency while reducing carbon dioxide emissions,” Tatsuhiko Shirai adds.
    This new technology directly increases the number of applications where the Ising machine can feasibly be used to produce solutions. As a result, Ising machines can see wider use across machine learning and optimization science. The team’s technology not only improves the performance of existing Ising machines, but also provides a blueprint for the development of new Ising machine architectures in the near future. With the merge algorithm driving Ising machines into uncharted territory, the future of optimization, and thus of sustainability practices, looks bright.
    Story Source:
    Materials provided by Waseda University. Note: Content may be edited for style and length.

  • Algorithms help to distinguish diseases at the molecular level

    In today’s medicine, doctors define and diagnose most diseases on the basis of symptoms. However, that does not necessarily mean that the illnesses of patients with similar symptoms will have identical causes or demonstrate the same molecular changes. In biomedicine, one often speaks of the molecular mechanisms of a disease. This refers to changes in the regulation of genes, proteins or metabolic pathways at the onset of illness. The goal of stratified medicine is to classify patients into various subtypes at the molecular level in order to provide more targeted treatments.
    New machine learning algorithms can help extract disease subtypes from large pools of patient data. They are designed to independently recognize patterns and correlations in extensive clinical measurements. The LipiTUM junior research group, headed by Dr. Josch Konstantin Pauling of the Chair for Experimental Bioinformatics, has developed an algorithm for this purpose.
    Complex analysis via automated web tool
    Their method combines the results of existing algorithms to obtain more precise and robust predictions of clinical subtypes. This unifies the characteristics and advantages of each algorithm and eliminates their time-consuming adjustment. “This makes it much easier to apply the analysis in clinical research,” reports Dr. Pauling. “For that reason, we have developed a web-based tool that permits online analysis of molecular clinical data by practitioners without prior knowledge of bioinformatics.”
    On the website (https://exbio.wzw.tum.de/mosbi/), researchers can submit their data for automated analysis and use the results to interpret their studies. “Another important aspect for us was the visualization of the results. Previous approaches were not capable of generating intuitive visualizations of relationships between patient groups, clinical factors and molecular signatures. This will change with the web-based visualization produced by our MoSBi tool,” says Tim Rose, a scientist at the TUM School of Life Sciences. MoSBi stands for “Molecular Signatures using Biclustering”; biclustering, which groups patients and molecular features simultaneously, is the technique on which the algorithm is based.
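    MoSBi combines the results of several existing biclustering algorithms; as a minimal illustration of the underlying idea (simultaneously grouping patients and molecular features), here is a sketch using scikit-learn's SpectralCoclustering on a hypothetical patients-by-lipids matrix with two planted subtypes. The data and cluster count are invented for the example and are not from the MoSBi study.

    ```python
    import numpy as np
    from sklearn.cluster import SpectralCoclustering

    rng = np.random.default_rng(0)

    # Hypothetical data: 100 patients x 40 lipid species, with two planted
    # patient subtypes that each up-regulate a different block of lipids.
    data = rng.normal(loc=0.0, scale=1.0, size=(100, 40))
    data[:50, :20] += 3.0   # subtype A signature
    data[50:, 20:] += 3.0   # subtype B signature

    model = SpectralCoclustering(n_clusters=2, random_state=0)
    model.fit(data - data.min() + 1e-6)  # shift to nonnegative values

    # Each bicluster pairs a patient group with its molecular signature.
    for k in range(2):
        patients = np.where(model.rows_[k])[0]
        lipids = np.where(model.columns_[k])[0]
        print(f"bicluster {k}: {patients.size} patients, {lipids.size} lipid features")
    ```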
    Application for clinically relevant questions
    With the tool, researchers can now, for example, represent data from cancer studies and simulations for various scenarios. They have already demonstrated the potential of their method in a large-scale clinical study. In a cooperative study conducted with researchers from the Max Planck Institute in Dresden, the Technical University of Dresden and the Kiel University Clinic, they studied the change in lipid metabolism in the liver of patients with non-alcoholic fatty liver disease (NAFLD).
    This widespread disease is associated with obesity and diabetes. It progresses from non-alcoholic fatty liver (NAFL), in which lipids accumulate in liver cells, through non-alcoholic steatohepatitis (NASH), in which the liver also becomes inflamed, to liver cirrhosis and the formation of tumors. Apart from dietary adjustments, no treatments have been found to date. Because the disease is characterized and diagnosed by the accumulation of various lipids in the liver, it is important to understand their molecular composition.
    Biomarkers for liver disease
    Using the MoSBi method, the researchers were able to demonstrate the heterogeneity of the livers of patients in the NAFL stage at the molecular level. “From a molecular standpoint, the liver cells of many NAFL patients were almost identical to those of NASH patients, while others were still largely similar to healthy patients. We could also confirm our predictions using clinical data,” says Dr. Pauling. “We were then able to identify two potential lipid biomarkers for disease progression.” This is important for early recognition of the disease, for tracking its progression and for developing targeted treatments.
    The research group is already working on further applications of their method to gain a better understanding of other diseases. “In the future, algorithms will play an even greater role in biomedical research than they already do today. They can make it significantly easier to detect complex mechanisms and find more targeted treatment approaches,” says Dr. Pauling.
    Story Source:
    Materials provided by Technical University of Munich (TUM). Note: Content may be edited for style and length.

  • A quarter of the world's Internet users rely on infrastructure that is susceptible to attacks

    About a quarter of the world’s Internet users live in countries that are more susceptible than previously thought to targeted attacks on their Internet infrastructure. Many of the at-risk countries are located in the Global South.
    That’s the conclusion of a sweeping study conducted by computer scientists at the University of California San Diego, who surveyed 75 countries.
    “We wanted to study the topology of the Internet to find weak links that, if compromised, would expose an entire nation’s traffic,” said Alexander Gamero-Garrido, the paper’s first author, who earned his Ph.D. in computer science at UC San Diego.
    Researchers presented their findings at the Passive and Active Measurement Conference 2022 online this spring.
    The structure of the Internet can differ dramatically in different parts of the world. In many developed countries, like the United States, numerous Internet providers compete to serve large numbers of users. These networks are directly connected to one another and exchange content, a process known as direct peering. All the providers can also plug directly into the world’s Internet infrastructure.
    “But a large portion of the Internet doesn’t function with peering agreements for network connectivity,” Gamero-Garrido pointed out.
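    The weak-link concern can be made concrete with a toy calculation: given a hypothetical mapping from a country's origin networks to the transit providers their traffic flows through, each provider can be scored by the share of the country's networks it could observe or disrupt. The dependency data below is invented for illustration and is not from the UC San Diego study.

    ```python
    from collections import Counter

    # Hypothetical dependency data: each origin network (AS) in a country,
    # mapped to the transit providers its traffic flows through.
    transit_deps = {
        "AS-A": ["TransitX"],
        "AS-B": ["TransitX", "TransitY"],
        "AS-C": ["TransitX"],
        "AS-D": ["TransitY"],
        "AS-E": ["TransitX"],
    }

    # Score each transit provider by the fraction of the country's origin
    # networks whose traffic it can observe or disrupt.
    counts = Counter(t for deps in transit_deps.values() for t in set(deps))
    for transit, n in counts.most_common():
        print(f"{transit}: exposes {n / len(transit_deps):.0%} of origin networks")
    ```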

  • AI learns coral reef 'song'

    Artificial Intelligence (AI) can track the health of coral reefs by learning the “song of the reef,” new research shows.
    Coral reefs have a complex soundscape — and even experts have to conduct painstaking analysis to measure reef health based on sound recordings.
    In the new study, University of Exeter scientists trained a computer algorithm using multiple recordings of healthy and degraded reefs, allowing the machine to learn the difference.
    The computer then analysed a host of new recordings, and successfully identified reef health 92% of the time.
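    The excerpt does not describe the study's exact model, but a typical pipeline of this kind extracts acoustic features from labelled recordings and trains a supervised classifier on them. A minimal sketch, with hypothetical file names and labels:

    ```python
    import librosa
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def reef_features(path):
        """Summarize a recording as mean MFCCs, one common acoustic fingerprint."""
        y, sr = librosa.load(path, sr=16000, mono=True)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
        return mfcc.mean(axis=1)

    # Hypothetical labelled recordings: 1 = healthy reef, 0 = degraded reef.
    paths = ["healthy_01.wav", "healthy_02.wav", "degraded_01.wav", "degraded_02.wav"]
    labels = np.array([1, 1, 0, 0])

    X = np.stack([reef_features(p) for p in paths])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print(cross_val_score(clf, X, labels, cv=2).mean())  # held-out accuracy
    ```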
    The team then used this technique to track the progress of reef restoration projects.
    “Coral reefs are facing multiple threats including climate change, so monitoring their health and the success of conservation projects is vital,” said lead author Ben Williams.

  • Agriculture tech use opens possibility of digital havoc

    Wide-ranging use of smart technologies is raising global agricultural production but international researchers warn this digital-age phenomenon could reap a crop of another kind — cybersecurity attacks.
    Complex IT and mathematical modelling at King Abdulaziz University in Saudi Arabia, Aix-Marseille University in France, and Flinders University in South Australia has highlighted the risks in a new article in the open-access journal Sensors.
    “Smart sensors and systems are used to monitor crops, plants, the environment, water, soil moisture, and diseases,” says lead author Professor Abel Alahmadi from King Abdulaziz University.
    “The transformation to digital agriculture would improve the quality and quantity of food for the ever-increasing human population, which is forecast to reach 10.9 billion by 2100.”
    This progress in production, genetic modification for drought-resistant crops, and other technologies is nevertheless vulnerable to cyber-attack — particularly if the ag-tech sector doesn’t take adequate precautions, as other corporate and defence sectors do, researchers warn.
    Flinders University researcher Dr Saeed Rehman says the rise of internet connectivity and smart low-power devices has facilitated the shift of many labour-intensive food production jobs into the digital domain — including modern techniques for accurate irrigation, soil and crop monitoring using drone surveillance.
    “However, we should not overlook security threats and vulnerabilities to digital agriculture, in particular possible side-channel attacks specific to ag-tech applications,” says Dr Rehman, an expert in cybersecurity and networking.
    “Digital agriculture is not immune to cyber-attack, as seen by interference to a US watering system, a meatpacking firm, wool broker software and an Australian beverage company.”
    “Extraction of cryptographic or sensitive information from the operation of physical hardware is termed side-channel attack,” adds Flinders co-author Professor David Glynn.
    “These attacks could be easily carried out with physical access to devices, which the cybersecurity community has not explicitly investigated.”
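    As a minimal, generic illustration of the class of attack described here, the sketch below demonstrates a timing side channel, one of the simplest kinds: a naive byte-by-byte comparison returns early on the first mismatch, so response time leaks how many leading bytes of a guess are correct. The secret, alphabet and per-byte delay are hypothetical, and this is a textbook demonstration rather than an ag-tech attack from the paper.

    ```python
    import time

    SECRET = b"A7F3"  # hypothetical device token

    def naive_check(guess: bytes) -> bool:
        # Returns early at the first mismatch: running time leaks progress.
        for a, b in zip(SECRET, guess):
            if a != b:
                return False
            time.sleep(0.001)  # stand-in for per-byte work a real device might do
        return len(guess) == len(SECRET)

    def time_guess(guess: bytes, trials: int = 5) -> float:
        best = float("inf")
        for _ in range(trials):
            t0 = time.perf_counter()
            naive_check(guess)
            best = min(best, time.perf_counter() - t0)  # min damps timing noise
        return best

    # Recover the secret one byte at a time by picking the slowest response.
    alphabet = b"0123456789ABCDEF"
    recovered = b""
    for _ in range(len(SECRET)):
        recovered += max(
            (bytes([c]) for c in alphabet),
            key=lambda c: time_guess(recovered + c),
        )
    print(recovered)  # b'A7F3'
    ```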
    The researchers recommend investment in precautions and awareness of the vulnerabilities of digital agriculture to cyber-attack, given the potentially serious effects on the general population in terms of food supply, labour and flow-on costs.
    Story Source:
    Materials provided by Flinders University. Note: Content may be edited for style and length.

  • Tiny robotic crab is smallest-ever remote-controlled walking robot

    Northwestern University engineers have developed the smallest-ever remote-controlled walking robot — and it comes in the form of a tiny, adorable peekytoe crab.
    Just a half-millimeter wide, the tiny crabs can bend, twist, crawl, walk, turn and even jump. The researchers also developed millimeter-sized robots resembling inchworms, crickets and beetles. Although the research is exploratory at this point, the researchers believe their technology might bring the field closer to realizing micro-sized robots that can perform practical tasks inside tightly confined spaces.
    The research will be published on Wednesday (May 25) in the journal Science Robotics. Last September, the same team introduced a winged microchip that was the smallest-ever human-made flying structure.
    “Robotics is an exciting field of research, and the development of microscale robots is a fun topic for academic exploration,” said John A. Rogers, who led the experimental work. “You might imagine micro-robots as agents to repair or assemble small structures or machines in industry or as surgical assistants to clear clogged arteries, to stop internal bleeding or to eliminate cancerous tumors — all in minimally invasive procedures.”
    “Our technology enables a variety of controlled motion modalities and can walk with an average speed of half its body length per second,” added Yonggang Huang, who led the theoretical work. “This is very challenging to achieve at such small scales for terrestrial robots.”
    A pioneer in bioelectronics, Rogers is the Louis Simpson and Kimberly Querrey Professor of Materials Science and Engineering, Biomedical Engineering and Neurological Surgery at Northwestern’s McCormick School of Engineering and Feinberg School of Medicine and the director of the Querrey Simpson Institute for Bioelectronics (QSIB). Huang is the Jan and Marcia Achenbach Professor of Mechanical Engineering and Civil and Environmental Engineering at McCormick and a key member of QSIB.

  • Researchers teleport quantum information across rudimentary quantum network

    Researchers in Delft have succeeded in teleporting quantum information across a rudimentary network. This first-of-its-kind demonstration is an important step towards a future quantum Internet. The breakthrough was made possible by a greatly improved quantum memory and enhanced quality of the quantum links between the three nodes of the network. The researchers, working at QuTech — a collaboration between Delft University of Technology and the Netherlands Organisation for Applied Scientific Research (TNO) — are publishing their findings today in the scientific journal Nature.
    The power of a future quantum Internet is based on the ability to send quantum information (quantum bits) between the nodes of the network. This will enable all kinds of applications such as securely sharing confidential information, linking several quantum computers together to increase their computing capability, and the use of highly precise, linked quantum sensors.
    Sending quantum information
    The nodes of such a quantum network consist of small quantum processors. Sending quantum information between these processors is no easy feat. One possibility is to send quantum bits using light particles but, due to the inevitable losses in glass fibre cables, in particular over long distances, the light particles will very likely not reach their destination. As it is fundamentally impossible to simply copy quantum bits, the loss of a light particle means that the quantum information is irrecoverably lost.
    Teleportation offers a better way of sending quantum information. The protocol for quantum teleportation owes its name to similarities with teleportation in science-fiction films: the quantum bit disappears on the side of the sender and appears on the side of the receiver. As the quantum bit therefore does not need to travel across the intervening space, there is no chance that it will be lost. This makes quantum teleportation a crucial technique for a future quantum Internet.
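    As a minimal numerical sketch of the textbook teleportation protocol the article describes (not QuTech's hardware implementation), the following simulates three qubits and checks that the state prepared on the sender's qubit reappears exactly on the receiver's. Measurement is deferred: the usual classical corrections are applied as controlled gates, which is mathematically equivalent for this purpose.

    ```python
    import numpy as np

    I2 = np.eye(2)
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    X = np.array([[0, 1], [1, 0]])
    Z = np.array([[1, 0], [0, -1]])

    def op(gate, pos, n=3):
        """Lift a single-qubit gate to n qubits (qubit 0 = leftmost tensor factor)."""
        mats = [gate if q == pos else I2 for q in range(n)]
        out = mats[0]
        for m in mats[1:]:
            out = np.kron(out, m)
        return out

    def controlled(gate, control, target, n=3):
        """Apply `gate` to `target` on basis states where `control` is 1."""
        dim = 2 ** n
        out = np.zeros((dim, dim), dtype=complex)
        for col in range(dim):
            cbit = (col >> (n - 1 - control)) & 1
            tbit = (col >> (n - 1 - target)) & 1
            if cbit == 0:
                out[col, col] = 1
            else:
                for t in (0, 1):
                    row = (col & ~(1 << (n - 1 - target))) | (t << (n - 1 - target))
                    out[row, col] = gate[t, tbit]
        return out

    # Qubit 0 holds the state to teleport (hypothetical example amplitudes).
    psi = np.array([0.6, 0.8j])
    state = np.kron(psi, np.kron([1, 0], [1, 0])).astype(complex)

    state = op(H, 1) @ state             # entangle qubits 1 and 2
    state = controlled(X, 1, 2) @ state  # ... into a Bell pair (the quantum link)
    state = controlled(X, 0, 1) @ state  # Bell-basis rotation on the
    state = op(H, 0) @ state             # sender's qubits 0 and 1
    state = controlled(X, 1, 2) @ state  # receiver's corrections, applied as
    state = controlled(Z, 0, 2) @ state  # controlled gates (deferred measurement)

    # Qubit 2 now holds psi exactly; the overall state is |+>|+>|psi>.
    plus = np.array([1, 1]) / np.sqrt(2)
    expected = np.kron(plus, np.kron(plus, psi))
    print("fidelity:", abs(np.vdot(expected, state)) ** 2)  # -> 1.0
    ```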
    Good control over the system
    In order to be able to teleport quantum bits, several ingredients are required: a quantum entangled link between the sender and receiver, a reliable method for reading out quantum processors, and the capacity to temporarily store quantum bits. Previous research at QuTech demonstrated that it is possible to teleport quantum bits between two adjacent nodes. The researchers at QuTech have now shown for the first time that they can meet the full set of requirements and have demonstrated teleportation between non-adjacent nodes, in other words over a network. They teleported quantum bits from node “Charlie” to node “Alice,” with the help of an intermediate node “Bob.”

  • Roboticists go off road to compile data that could train self-driving ATVs

    Researchers from Carnegie Mellon University took an all-terrain vehicle on wild rides through tall grass, loose gravel and mud to gather data about how the ATV interacted with a challenging, off-road environment.
    They drove the heavily instrumented ATV aggressively at speeds up to 30 miles an hour. They slid through turns, took it up and down hills, and even got it stuck in the mud — all while gathering data such as video, the speed of each wheel and the amount of suspension shock travel from seven types of sensors.
    The resulting dataset, called TartanDrive, includes about 200,000 of these real-world interactions. The researchers believe it is the largest real-world, multimodal, off-road driving dataset, in terms of both the number of interactions and the types of sensors. The five hours of data could be useful for training a self-driving vehicle to navigate off road.
    “Unlike autonomous street driving, off-road driving is more challenging because you have to understand the dynamics of the terrain in order to drive safely and to drive faster,” said Wenshan Wang, a project scientist in the Robotics Institute (RI).
    Previous work on off-road driving has often involved annotated maps, which provide labels such as mud, grass, vegetation or water to help the robot understand the terrain. But that sort of information isn’t often available and, even when it is, might not be useful. A map area labeled as “mud,” for example, may or may not be drivable. Robots that understand dynamics can reason about the physical world.
    The research team found that the multimodal sensor data they gathered for TartanDrive enabled them to build prediction models superior to those developed with simpler, nondynamic data. Driving aggressively also pushed the ATV into a performance realm where an understanding of dynamics became essential, said Samuel Triest, a second-year master’s student in robotics.
    “The dynamics of these systems tend to get more challenging as you add more speed,” said Triest, who was lead author on the team’s resulting paper. “You drive faster, you bounce off more stuff. A lot of the data we were interested in gathering was this more aggressive driving, more challenging slopes and thicker vegetation because that’s where some of the simpler rules start breaking down.”
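    The story does not detail TartanDrive's prediction models, but the core idea, learning a forward dynamics model that predicts the next vehicle state from the current state and action, can be sketched generically. The feature layout, dimensions and synthetic labels below are hypothetical stand-ins for the real logged data:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Hypothetical logged triples from driving runs:
    #   state  = [speed, roll, pitch, wheel speeds, suspension travel, ...]
    #   action = [throttle, steering]
    n, state_dim, action_dim = 5000, 12, 2
    states = rng.normal(size=(n, state_dim))
    actions = rng.uniform(-1, 1, size=(n, action_dim))
    # Stand-in for the real next-state labels recorded by the sensors.
    next_states = states + 0.1 * actions @ rng.normal(size=(action_dim, state_dim))

    # Forward dynamics model: f(state, action) -> next state.
    X = np.hstack([states, actions])
    model = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=300, random_state=0)
    model.fit(X[:4000], next_states[:4000])

    # A planner can roll this model forward to score candidate action sequences.
    print("held-out MSE:", np.mean((model.predict(X[4000:]) - next_states[4000:]) ** 2))
    ```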
    Though most work on self-driving vehicles focuses on street driving, the first applications likely will be off road in controlled access areas, where the risk of collisions with people or other vehicles is limited. The team’s tests were performed at a site near Pittsburgh that CMU’s National Robotics Engineering Center uses to test autonomous off-road vehicles. Humans drove the ATV, though they used a drive-by-wire system to control steering and speed.
    “We were forcing the human to go through the same control interface as the robot would,” Wang said. “In that way, the actions the human takes can be used directly as input for how the robot should act.”
    Triest will present the TartanDrive study at the International Conference on Robotics and Automation (ICRA) this week in Philadelphia. In addition to Triest and Wang, the research team included Sebastian Scherer, associate research professor in the RI; Aaron Johnson, an assistant professor of mechanical engineering; Sean J. Wang, a Ph.D. student in mechanical engineering; and Matthew Sivaprakasam, a computer engineering student at the University of Pittsburgh.
    Story Source:
    Materials provided by Carnegie Mellon University. Original written by Byron Spice. Note: Content may be edited for style and length.