More stories

  • Its curvature foreshadows the next financial bubble

    An international team of interdisciplinary researchers has identified mathematical metrics to characterize the fragility of financial markets. Their paper “Network geometry and market instability” sheds light on the higher-order architecture of financial systems and allows analysts to identify systemic risks like market bubbles or crashes.
    With the recent rush of small investors into so-called meme stocks and reemerging interest in cryptocurrencies, talk of market instability, rising volatility, and bursting bubbles is surging. However, “traditional economic theories cannot foresee events like the US subprime mortgage collapse of 2007,” according to study author Areejit Samal. He and his colleagues from more than ten mathematics, physics, economics, and complex-systems institutions around the globe have made a great stride in characterizing stock market instability.
    Their paper abstracts the complexity of the financial market into a network of stocks and employs geometry-inspired network measures to gauge market fragility and financial dynamics. They analyzed and contrasted the stock market networks for the US S&P 500 and the Japanese Nikkei 225 indices over a 32-year period (1985-2016) and, for the first time, showed that several discrete Ricci curvatures are excellent indicators of market instabilities. The work was recently published in the journal Royal Society Open Science and allows analysts to distinguish between ‘business-as-usual’ periods and times of fragility such as bubbles or market crashes.
    The network created by connecting stocks with highly correlated prices and trading volumes forms the structural basis of their work. The researchers then employ four discrete curvatures, developed by the director of Max Planck Institute for Mathematics in the Sciences Jürgen Jost and his coworkers, to study the changes in the structure of stock market networks over time. Their comparisons to other market stability metrics have shown that their four notions of curvature serve as generic indicators of market instability.
    One curvature candidate, the Forman-Ricci curvature (FRE), has a particularly high correlation with traditional financial indicators and can accurately capture market fear (volatility) and fragility (risk). The study confirms that in normal trading periods the market is very fragmented, whereas in times of bubbles and impending crashes correlations between stocks become more uniform and highly interconnected. The FRE is sensitive to both sector-driven and global market fluctuations; whereas common indicators such as returns remain inconspicuous, network curvatures expose these dynamics and reach extreme values during a bubble. Thus, the FRE can capture the interdependencies within and between sectors that facilitate the spreading of perturbations and increase the danger of market crashes.
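On an unweighted graph, the Forman-Ricci curvature of an edge reduces to a simple combinatorial expression. The sketch below is a minimal illustration of the idea, not the authors' pipeline (which uses weighted curvature variants and real return data): threshold a toy correlation matrix into a stock network, then score each edge. The threshold value and ticker names are made up.

```python
from itertools import combinations

def forman_ricci(adj):
    """Forman-Ricci curvature of each edge of an unweighted graph:
    F(u, v) = 4 - deg(u) - deg(v).  Edges in densely interconnected
    regions get strongly negative values, the regime the paper
    associates with bubbles and impending crashes."""
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    return {(u, v): 4 - deg[u] - deg[v]
            for u in adj for v in adj[u] if u < v}

def correlation_network(corr, threshold=0.65):
    """Connect two stocks when their correlation exceeds a threshold
    (the cutoff here is illustrative)."""
    stocks = sorted(corr)
    adj = {s: set() for s in stocks}
    for u, v in combinations(stocks, 2):
        if corr[u].get(v, 0.0) >= threshold:
            adj[u].add(v)
            adj[v].add(u)
    return adj

# Toy data: in a bubble-like regime, all pairs correlate strongly.
corr = {
    "AAA": {"BBB": 0.9, "CCC": 0.8},
    "BBB": {"AAA": 0.9, "CCC": 0.7},
    "CCC": {"AAA": 0.8, "BBB": 0.7},
}
curv = forman_ricci(correlation_network(corr))
print(curv)  # each edge of the triangle: 4 - 2 - 2 = 0
```

Tracking a summary statistic of these edge curvatures over a sliding time window is the kind of continuous market-risk scan the study describes.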
    Max Planck Institute for Mathematics in the Sciences director Jürgen Jost summarizes the struggle of analyzing market fragility: “there are no easy definitions of a market crash or bubble and merely monitoring established market indices or log-returns does not suffice, but our methodology offers a powerful tool for continuously scanning market risk and thus the health of the financial system.” The insights gained by this study can help decision-makers to better understand systemic risk and identify tipping points, which can potentially forecast coming financial crises or possibly even avoid them altogether.
    Story Source:
    Materials provided by Max-Planck-Gesellschaft. Note: Content may be edited for style and length.

  • Expressing some doubts about android faces

    Researchers from the Graduate School of Engineering and Symbiotic Intelligent Systems Research Center at Osaka University used motion capture cameras to compare the expressions of android and human faces. They found that the mechanical facial movements of the robots, especially in the upper regions, did not fully reproduce the curved flow lines seen in the faces of actual people. This research may lead to more lifelike and expressive artificial faces.
    The field of robotics has advanced a great deal over the past decades. However, while current androids can appear very humanlike at first, their active facial expressions may still be unnatural and slightly unsettling to us. The exact reasons for this effect have been difficult to pinpoint. Now, a research team at Osaka University has used motion capture technology to monitor the facial expressions of five android faces and compared the results with actual human facial expressions. This was accomplished with six infrared cameras that monitored reflection markers at 120 frames per second and allowed the motions to be represented as three-dimensional displacement vectors.
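Representing marker motion as three-dimensional displacement vectors is straightforward once neutral and peak-expression positions are known. A minimal sketch, with made-up marker names and coordinates rather than the team's actual capture data:

```python
import math

def displacement_vectors(neutral, expression):
    """Per-marker 3D displacement from the neutral face to the peak
    of an expression, as tracked frame-by-frame at 120 fps."""
    return {m: tuple(e - n for n, e in zip(neutral[m], expression[m]))
            for m in neutral}

def magnitudes(vectors):
    """Euclidean length of each displacement vector."""
    return {m: math.sqrt(sum(c * c for c in v)) for m, v in vectors.items()}

# Hypothetical marker positions in mm: a brow marker and a cheek marker.
neutral    = {"brow": (0.0, 50.0, 10.0), "cheek": (20.0, 10.0, 12.0)}
expression = {"brow": (0.0, 53.0, 14.0), "cheek": (20.0, 11.0, 12.0)}

vecs = displacement_vectors(neutral, expression)
print(magnitudes(vecs))  # brow moves 5.0 mm, cheek 1.0 mm
```

Comparing the direction fields of such vectors between android and human faces is what reveals the straight versus curved flow lines discussed below.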
    “Advanced artificial systems can be difficult to design because the numerous components have complex interactions with each other. The appearance of an android face can experience surface deformations that are hard to control,” study first author Hisashi Ishihara says. These deformations can be due to interactions between components such as the soft skin sheet and the skull-shaped structure, as well as the mechanical actuators.
    The first difference the team found between the androids and adult males was in their flow lines, especially the eye and forehead areas. These lines tended to be almost straight for the androids but were curved for the human adult males. Another major difference was with the skin surface undulation patterns in the upper part of the face.
    “Redesigning the face of androids so that the skin flow pattern resembles that of humans may reduce the discomfort induced by the androids and improve their emotional communication performance,” senior author Minoru Asada says. “Future work may help give android faces the same level of expressiveness as humans have. Each robot may even have its own individual ‘personality’ that will help people feel more comfortable.”
    Story Source:
    Materials provided by Osaka University. Note: Content may be edited for style and length.

  • People affected by COVID-19 are being nicer to machines

    People are not very nice to machines. The disdain goes beyond the slot machine that emptied your wallet, a dispenser that failed to deliver a Coke or a navigation system that took you on an unwanted detour.
    Yet USC researchers report that people affected by COVID-19 are showing more goodwill — to humans and to human-like autonomous machines.
    “The new discovery here is that when people are distracted by something distressing, they treat machines socially like they would treat other people. We found greater faith in technology due to the pandemic and a closing of the gap between humans and machines,” said Jonathan Gratch, senior author of the study and director for virtual humans research at the USC Institute for Creative Technologies.
    The findings, which appeared recently in the journal iScience, come from researchers at USC, George Mason University and the U.S. Department of Defense.
    The scientists noted that, in general, people dispense with the social norms of human interaction and treat machines differently. The behavior holds even as machines become more humanlike; think of Alexa, the persona in your vehicle navigation system, or other virtual assistants. This is because human default behavior is often driven by heuristic thinking — the snap judgments people use to navigate complex daily interactions.
    In studying human-machine interactions, the researchers noted that people impacted by COVID-19 also displayed more altruism, both toward other people and toward machines.

  • A promising breakthrough for a better design of electronic materials

    Finding the best materials for tomorrow’s electronics is the goal of Professor Emanuele Orgiu of the Institut national de la recherche scientifique (INRS). Among the materials in which Professor Orgiu is interested, some are made of molecules that can conduct electricity. He has demonstrated the role played by molecular vibrations on electron conductivity on crystals of such materials. This finding is important for applications of these molecular materials in electronics, energy and information storage. The study, conducted in collaboration with a team from the INRS and the University of Strasbourg (France), was published in the Advanced Materials journal.
    The scientists were interested in the relationship between the structure of materials and their ability to conduct electricity. To this end, they measured the speed of propagation of electrons in crystals formed by these molecules. In their study, the authors compared two perylene diimide derivatives, semiconducting molecules of interest because of their use in flexible devices, smart clothes, and foldable electronics. The two compounds have similar chemical structures but feature very different conduction properties.
    With the goal of determining what caused this difference, the research group was able to establish that the different molecular vibrations composing the material were responsible for the different electrical behaviour observed in devices. “For a current to flow through a material, electrons must ‘hop’ from one molecule to the neighbouring one. Depending on the level of ‘movement’ of the molecules, which depends on the amplitude and energy of the related vibrations (called phonons), the electrons can move more or less easily through the material,” explains Professor Orgiu, whose research team is the first to demonstrate which vibrations have the greatest influence on electron flows.
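The hopping picture Professor Orgiu describes is commonly modeled with the textbook semiclassical Marcus rate, in which the reorganization energy captures how strongly charges couple to molecular vibrations. This sketch uses that standard formula with hypothetical parameter values; it is an illustration of the physics, not the paper's specific model.

```python
import math

HBAR = 6.582e-16   # reduced Planck constant, eV*s
KB   = 8.617e-5    # Boltzmann constant, eV/K

def marcus_rate(J, lam, T=300.0, dG=0.0):
    """Semiclassical Marcus rate for an electron hop between molecules.
    J   : electronic coupling between neighbours (eV)
    lam : reorganization energy (eV); stronger coupling to the
          relevant phonons means a larger lam and slower hopping
    dG  : driving force (eV), zero for hops between identical molecules
    """
    prefactor = (2.0 * math.pi / HBAR) * J ** 2
    gaussian = math.exp(-(dG + lam) ** 2 / (4.0 * lam * KB * T))
    return prefactor * gaussian / math.sqrt(4.0 * math.pi * lam * KB * T)

# Two hypothetical derivatives with identical coupling but different
# vibrational character: the one with larger reorganization energy
# hops far more slowly, i.e. conducts worse.
stiff = marcus_rate(J=0.05, lam=0.2)
soft  = marcus_rate(J=0.05, lam=0.6)
print(stiff > soft)  # True
```

The exponential dependence on lam is why seemingly similar molecules can show very different conduction, as the two perylene diimide derivatives did.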
    An Ad Hoc Molecular Design to Make Electrons Travel Faster
    This breakthrough paves the way for the development of even more efficient materials for electronics. “By knowing what type of vibrations allows charges to move more easily, we are providing chemists with a formula for synthesizing the right materials, rather than going in blindly,” explains Marc-Antoine Stoeckel. This research opens up new applications that could not be envisaged with silicon, the most widely used material in electronics, including computers.
    Professor Orgiu collaborated with INRS Professor Luca Razzari to measure the vibrations of the molecules. The two researchers are now working on a new spectroscopic technique that would enable them to visualize the vibrations when electrons are present. This will allow them to see if charges affect molecular vibrations.
    Story Source:
    Materials provided by Institut national de la recherche scientifique – INRS. Original written by Audrey-Maude Vézina. Note: Content may be edited for style and length.

  • Better batteries start with basics — and a big computer

    To understand the fundamental properties of an industrial solvent, chemists with the University of Cincinnati turned to a supercomputer.
    UC chemistry professor and department head Thomas Beck and UC graduate student Andrew Eisenhart ran quantum simulations to understand glycerol carbonate, a compound used in biodiesel and as a common solvent.
    They found that the simulation provided detail about hydrogen bonding in determining the structural and dynamic properties of the liquid that was missing from classical models. The study was published in the Journal of Physical Chemistry B.
    Glycerol carbonate could be a more environmentally friendly chemical solvent for things like batteries. But chemists have to know more about what’s going on in these solutions. The researchers studied solutions of the compounds potassium fluoride and potassium chloride.
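The structural properties of a liquid that such simulations probe are usually summarized with pair statistics like the radial distribution function. As a loose illustration (toy coordinates, not simulation output), the core distance-histogram step under periodic boundary conditions looks like:

```python
import math
from itertools import combinations

def pair_distances(positions, box):
    """All pair distances in a cubic periodic box, using the
    minimum-image convention; the first step toward a radial
    distribution function g(r) that characterizes liquid structure."""
    out = []
    for (x1, y1, z1), (x2, y2, z2) in combinations(positions, 2):
        d2 = 0.0
        for a, b in ((x1, x2), (y1, y2), (z1, z2)):
            delta = b - a
            delta -= box * round(delta / box)  # wrap to nearest image
            d2 += delta * delta
        out.append(math.sqrt(d2))
    return out

# Toy particle coordinates (angstroms) in a 10-angstrom box.
positions = [(0.0, 0.0, 0.0), (9.0, 0.0, 0.0), (0.0, 2.5, 0.0)]
print(sorted(pair_distances(positions, box=10.0)))
```

Note how the first pair, nominally 9 angstroms apart, is really only 1 angstrom apart through the periodic boundary; getting this bookkeeping right is what makes solvent-structure statistics meaningful.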
    “The study we did gives us a fundamental understanding of how small changes to a molecular structure can have larger consequences for the solvent as a whole,” Eisenhart said. “And how these small changes make its interactions with very important things like ions and can have an effect on things like battery performance.”
    Water is a seemingly simple solvent, as anyone who has stirred sugar in their coffee can attest.

  • Solving 'barren plateaus' is the key to quantum machine learning

    Many machine learning algorithms on quantum computers suffer from the dreaded “barren plateau” of unsolvability, where they run into dead ends on optimization problems. This challenge had been relatively unstudied — until now. Rigorous theoretical work has established theorems that guarantee whether a given machine learning algorithm will work as it scales up on larger computers.
    “The work solves a key problem of usability for quantum machine learning. We rigorously proved the conditions under which certain architectures of variational quantum algorithms will or will not have barren plateaus as they are scaled up,” said Marco Cerezo, lead author on the paper published in Nature Communications today by a Los Alamos National Laboratory team. Cerezo is a postdoc researching quantum information theory at Los Alamos. “With our theorems, you can guarantee that the architecture will be scalable to quantum computers with a large number of qubits.”
    “Usually the approach has been to run an optimization and see if it works, and that was leading to fatigue among researchers in the field,” said Patrick Coles, a coauthor of the study. Establishing mathematical theorems and deriving first principles takes the guesswork out of developing algorithms.
    The Los Alamos team used the common hybrid approach for variational quantum algorithms, training and optimizing the parameters on a classical computer and evaluating the algorithm’s cost function, or the measure of the algorithm’s success, on a quantum computer.
    Machine learning algorithms translate an optimization task — say, finding the shortest route for a traveling salesperson through several cities — into a cost function, said coauthor Lukasz Cincio. That’s a mathematical description of a function that will be minimized. The function reaches its minimum value only if you solve the problem.
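The hybrid train-and-evaluate cycle can be mimicked entirely classically. In the sketch below, a toy quadratic stands in for the quantum cost-function evaluation (which would really come from measuring a circuit), and a finite-difference gradient stands in for parameter-shift rules; everything here is illustrative.

```python
def cost(params):
    """Stand-in for the quantum side: returns the cost of the current
    circuit parameters.  Minimum is at every parameter equal to 0.5."""
    return sum((p - 0.5) ** 2 for p in params)

def finite_diff_grad(f, params, eps=1e-5):
    """Classical gradient estimate of the cost with respect to each
    trainable parameter."""
    grad = []
    for i in range(len(params)):
        shifted = list(params)
        shifted[i] += eps
        grad.append((f(shifted) - f(params)) / eps)
    return grad

# Classical optimizer loop: repeatedly query the cost, update parameters.
params = [0.0, 1.0, -0.3]
for _ in range(200):
    g = finite_diff_grad(cost, params)
    params = [p - 0.1 * gi for p, gi in zip(params, g)]

print(cost(params) < 1e-6)  # True: the optimizer reached the minimum
```

A barren plateau is the failure mode of exactly this loop: on a plateau the estimated gradients are exponentially close to zero in the number of qubits, so the updates stall long before the minimum is reached.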
    Most quantum variational algorithms initiate their search randomly and evaluate the cost function globally across every qubit, which often leads to a barren plateau.

  • Researchers help keep pace with Moore's Law by exploring a new material class

    Progress in the field of integrated circuits is measured by matching, exceeding, or falling behind the rate set forth by Gordon Moore, former CEO and co-founder of Intel, who said the number of electronic components, or transistors, per integrated circuit would double every year. That was more than 50 years ago, and surprisingly his prediction, now called Moore’s Law, came true.
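Doubling compounds quickly; a back-of-the-envelope sketch of the growth curve (the starting figure is the well-known transistor count of the 1971 Intel 4004):

```python
def transistors(start_count, years, doubling_period=1.0):
    """Exponential growth under Moore's Law.  Moore's original 1965
    statement used a one-year doubling period; his 1975 revision
    used roughly two years."""
    return start_count * 2 ** (years / doubling_period)

# From the Intel 4004's ~2,300 transistors, a two-year doubling period
# passes the billion-transistor mark in about 38 years.
print(transistors(2300, 38, doubling_period=2.0))  # 1205862400.0
```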
    In recent years, it was thought that the pace had slowed; one of the biggest challenges of putting more circuits and power on a smaller chip is managing heat.
    A multidisciplinary group that includes Patrick E. Hopkins, a professor in the University of Virginia’s Department of Mechanical and Aerospace Engineering, and Will Dichtel, a professor in Northwestern University’s Department of Chemistry, is inventing a new class of material with the potential to keep chips cool as they keep shrinking in size — and to help Moore’s Law remain true. Their work was recently published in Nature Materials.
    Electrical insulation materials that minimize electrical crosstalk in chips are called “low-k” dielectrics. This material type is the silent hero that makes all electronics possible by steering the current to eliminate signal erosion and interference; ideally, it can also pull damaging heat caused by electrical current away from the circuitry. The heat problem compounds as the chip gets smaller: not only are there more transistors in a given area, generating more heat in that same area, but they are also closer together, making it harder for heat to dissipate.
    “Scientists have been in search of a low-k dielectric material that can handle the heat transfer and space issues inherent at much smaller scales,” Hopkins said. “Although we’ve come a long way, new breakthroughs are just not going to happen unless we combine disciplines. For this project we’ve used research and principles from several fields — mechanical engineering, chemistry, materials science, electrical engineering — to solve a really tough problem that none of us could work out on our own.”
    Hopkins is one of the leaders of UVA Engineering’s Multifunctional Materials Integration initiative, which brings together researchers from multiple engineering disciplines to formulate materials with a wide array of functionalities.

  • New statistical model predicts which cities could become 'superspreaders'

    Researchers have developed a new statistical model that predicts which cities are more likely to become infectious disease hotspots, based both on interconnectivity between cities and the idea that some cities are more suitable environments for infection than others. Brandon Lieberthal and Allison Gardner of the University of Maine present these findings in the open-access journal PLOS Computational Biology.
    In an epidemic, different cities have varying risks of triggering superspreader events, which send unusually large numbers of infected people on to other cities. Previous research has explored how to identify potential “superspreader cities” based on how well each city is connected to others or on each city’s distinct suitability as an environment for infection. However, few studies have incorporated both factors at once.
    Now, Lieberthal and Gardner have developed a mathematical model that identifies potential superspreaders by incorporating both connectivity between cities and their varying suitability for infection. A city’s infection suitability depends on the specific disease being considered, but could incorporate characteristics such as climate, population density, and sanitation.
    The researchers validated their model with a simulation of epidemic spread across randomly generated networks. They found that the risk of a city becoming a superspreader increases with infection suitability only up to a certain extent, but risk increases indefinitely with increased connectivity to other cities.
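That qualitative finding, risk saturating in suitability but growing without bound in connectivity, can be captured by a simple score. The functional form below is hypothetical, chosen only to mirror the behavior the simulations revealed; it is not the authors' formula.

```python
def superspreader_risk(connectivity, suitability, half_saturation=1.0):
    """Illustrative risk score for a city: linear (unbounded) in
    connectivity, but saturating in infection suitability via a
    Michaelis-Menten-style term s / (s + k)."""
    return connectivity * suitability / (suitability + half_saturation)

# A hyper-connected hub with modest suitability can out-rank a highly
# suitable but isolated city.
hub      = superspreader_risk(connectivity=50, suitability=0.8)
isolated = superspreader_risk(connectivity=3,  suitability=5.0)
print(hub > isolated)  # True

# Saturation: even enormous suitability cannot push the score past
# the connectivity itself.
print(superspreader_risk(10, 1e9) < 10)  # True
```

Ranking every node of a mobility network by such a score is the kind of output the formula mentioned below would produce for disease managers.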
    “Most importantly, our research produces a formula in which a disease management expert can input the properties of an infectious disease and the human mobility network and output a list of cities that are most likely to become superspreader locations,” Lieberthal says. “This could improve efforts to prevent or mitigate spread.”
    The new model can be applied to both directly transmitted diseases, such as COVID-19, or to vector-borne illnesses, such as the mosquito-borne Zika virus. It could provide more in-depth guidance than traditional metrics of risk, but is also much less computationally intensive than advanced simulations.
    Story Source:
    Materials provided by PLOS. Note: Content may be edited for style and length.