More stories

  • Researchers look to human 'social sensors' to better predict elections and other trends

    Election outcomes are notoriously difficult to predict. In 2016, for example, most polls suggested that Hillary Clinton would win the presidency, but Donald Trump defeated her. Researchers cite multiple explanations for the unreliability in election forecasts — some voters are difficult to reach, and some may wish to remain hidden. Among those who do respond to surveys, some may change their minds after being polled, while others may be embarrassed or afraid to report their true intentions.
    In a new perspective piece for Nature, Santa Fe Institute researchers Mirta Galesic, Jonas Dalege, Henrik Olsson, Daniel Stein, Tamara van der Does, and their collaborators* propose a surprising way to get around these shortcomings in survey design — not just in the world of politics, but in other types of research as well. While it’s widely assumed that cognitive bias clouds our assessment of the people around us, their research and that of others suggests that in fact, our estimations of what our friends and family believe are often accurate.
    “We realized that if we ask a national sample of people about who their friends are going to vote for, we get more accurate predictions than if we ask them who they’re going to vote for,” says Galesic, who is the corresponding author. “We found that people are actually pretty good at estimating the beliefs of people around them.”
    That means researchers can gather highly accurate information about social trends and groups by asking about a person’s social circle rather than interrogating their own individual beliefs. That’s because as highly social creatures, we have become very good at sizing up those around us — what researchers call “social sensing.”
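The gain from social-circle questions can be illustrated with a toy simulation. This is not the authors' model: the true vote share, the 30 percent "shy voter" rate, and the 90 percent accuracy of friend reports are all assumptions chosen for illustration.

```python
import random

random.seed(42)
N = 100_000          # survey respondents
TRUE_SHARE = 0.52    # true share intending to vote for candidate A (assumed)
SHY_RATE = 0.30      # share of A-voters who hide their intention when asked (assumed)
CIRCLE = 5           # friends each respondent reports on (assumed)

voters = [random.random() < TRUE_SHARE for _ in range(N)]

def self_report(intent):
    # "Shy" supporters of A misreport their own intention.
    return False if intent and random.random() < SHY_RATE else intent

def circle_report(intent):
    # Respondents judge a friend's intention correctly 90% of the time.
    # The symmetric errors pull the estimate slightly toward 50-50,
    # but far less than the shy-voter bias distorts self-reports.
    return intent if random.random() < 0.9 else not intent

self_est = sum(self_report(v) for v in voters) / N
reports = [circle_report(random.choice(voters)) for _ in range(N * CIRCLE)]
social_est = sum(reports) / len(reports)

print(f"true share    {TRUE_SHARE:.3f}")
print(f"self-report   {self_est:.3f}")    # biased low by the shy voters
print(f"social-circle {social_est:.3f}")  # noisy, but close to the truth
```

Under these assumptions the self-report estimate lands far below the true share, while the aggregated social-circle reports recover it closely, which is the intuition behind treating respondents as sensors of their own networks.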
    When people are selected to represent a particular group, their perceptions, combined with new computational models of human social dynamics, can be used to identify emerging trends and better predict political and health-related developments in particular, the team writes. This approach, combining elements of psychology and sociology, can even be harnessed to devise interventions that “could steer social systems in different directions” after a major event, such as a natural disaster or a mass shooting, they suggest.
    “I really hope human social sensing will be included in the standard social science toolbox, because I think it can be a very useful strategy for predicting and modeling societal trends,” Galesic says.
    * Mirta Galesic (Santa Fe Institute); Wändi Bruine de Bruin (University of Southern California); Jonas Dalege (Santa Fe Institute); Scott Feld (Purdue University); Frauke Kreuter (LMU Munich, University of Maryland); Henrik Olsson (Santa Fe Institute); Drazen Prelec (Sloan School of Management, MIT); Daniel Stein (New York University, Santa Fe Institute); and Tamara van der Does (Santa Fe Institute) are co-authors on the perspective piece.
    Story Source:
    Materials provided by Santa Fe Institute. Note: Content may be edited for style and length.

  • New research lifts the clouds on land clearing and biodiversity loss

    QUT researchers have developed a new machine-learning system that helps detect changes in biodiversity, including land clearing, when satellite imagery is obstructed by clouds.
    Using statistical methods to quantify uncertainty, the research, published in Remote Sensing in Ecology and Conservation, analysed available satellite images of a 180 km square area in central south-east Queensland.
    The region is home to many native species including the critically endangered northern hairy-nosed wombat and the vulnerable greater glider, and the area mainly consists of forest, pasture, and agricultural land.
    Dr Jacinta Holloway-Brown says measuring changes in forest cover over time is essential to track and preserve habitats, and supports a key United Nations and World Bank sustainable development goal to manage forests sustainably.
    “Satellite imagery is important as it is too difficult and expensive to frequently collect field data over large, forested areas,” Dr Holloway-Brown said.
    “The problem with using satellite imagery is large portions of the earth are obscured by clouds and this cloud cover causes large and frequent amounts of missing data.”
    Dr Holloway-Brown said it was estimated, based on 12 years of satellite imagery, that on average approximately 67 per cent of the earth is obscured by cloud cover.
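The missing-data problem can be sketched in a few lines. This is a generic gap-filling baseline, not the QUT team's method; the 30 per cent cloud rate and the vegetation-index values are made up for illustration.

```python
import numpy as np

np.random.seed(0)
T, H, W = 6, 4, 4                                # 6 satellite passes over a 4x4 pixel grid
ndvi = np.random.uniform(0.2, 0.9, (T, H, W))    # synthetic vegetation-index values
clouds = np.random.random((T, H, W)) < 0.3       # ~30% of pixels obscured by cloud
obs = np.where(clouds, np.nan, ndvi)             # what the satellite actually sees

# Baseline fix: carry each pixel's most recent cloud-free value forward in time.
filled = obs.copy()
for t in range(1, T):
    gap = np.isnan(filled[t])
    filled[t][gap] = filled[t - 1][gap]

print(f"missing before: {np.isnan(obs).mean():.0%}, after: {np.isnan(filled).mean():.0%}")
```

A statistical approach of the kind described in the article would go further by attaching an uncertainty estimate to every filled pixel, so that genuine land-clearing signals can be separated from imputation error.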

  • Thinking in 3D improves mathematical skills

    Spatial reasoning ability in young children predicts how well they will perform in mathematics later. Researchers from the University of Basel recently came to this conclusion, making the case for better cultivation of spatial reasoning.
    Good math skills open career doors in the natural sciences as well as technical and engineering fields. However, a nationwide study on basic skills conducted in Switzerland in 2019 found that schoolchildren achieved only modest results in mathematics. But it seems possible to begin promoting math skills from a young age, as Dr. Wenke Möhring’s team of researchers from the University of Basel reported after studying nearly 600 children.
    The team found a correlation between children’s spatial sense at the age of three and their mathematical abilities in primary school. “We know from past studies that adults think spatially when working with numbers — for example, represent small numbers to the left and large ones to the right,” explains Möhring. “But little research has been done on how spatial reasoning at an early age affects children’s learning and comprehension of mathematics later.”
    The study, which was published in the journal Learning and Instruction, suggests that there is a strong correlation between early spatial skills and the comprehension of mathematical concepts later. The researchers also ruled out the possibility that this correlation is due to other factors, such as socio-economic status or language ability. Exactly how spatial ability affects mathematical skills in children is still unclear, but the spatial conception of numbers might play a role.
    The findings are based on the analysis of data from 586 children in Basel, Switzerland. As part of a project on the acquisition of German as a second language, the researchers gave three-year-old children a series of tasks to test cognitive, socio-emotional and spatial abilities. For example, the children were asked to arrange colored cubes in certain shapes. The researchers repeated these tests four times at intervals of about 15 months and compared the results with the same children’s academic performance in the first grade, at age seven.
    The researchers also closely examined whether the pace of development, i.e. particularly rapid development of spatial abilities, can predict future mathematical ability. Past studies with a small sample size had found a correlation, but Möhring and her colleagues were unable to confirm this in their own study. Three-year-old children who started out with low spatial abilities improved them faster in the subsequent years, but still performed at a lower level in mathematics when they were seven years old. Despite faster development, by the time they began school these children had still not fully caught up with the children possessing higher initial spatial reasoning skills.
    “Parents often push their children in the area of language skills,” says Möhring. “Our results suggest how important it is to cultivate spatial reasoning at an early age as well.” There are simple ways to do this, such as using “spatial language” (larger, smaller, same, above, below) and toys — e.g. building blocks — that help improve spatial reasoning ability.
    Spatial reasoning and gender
    The researchers found that boys and girls are practically indistinguishable in terms of their spatial reasoning ability at the age of three, but in subsequent years this ability develops more slowly in girls. Möhring and her colleagues suspect that boys may hear more “spatial language” and that toys typically designed for boys often promote spatial reasoning, whereas toys for girls focus mainly on social skills. Children may also internalize their parents’ and teachers’ expectations and then, as they grow up, live up to stereotypes — for example, that women do not perform as well as men in spatial reasoning and mathematics.
    Story Source:
    Materials provided by University of Basel.

  • Technology only two atoms thick could enable storage of information in thinnest unit

    Researchers from Tel Aviv University have engineered the world’s tiniest technology, with a thickness of only two atoms. According to the researchers, the new technology proposes a way of storing electric information in the thinnest unit known to science, in one of the most stable and inert materials in nature. Quantum-mechanical electron tunneling through the atomically thin film may speed the reading of information well beyond current technologies.
    The research was performed by scientists from the Raymond and Beverly Sackler School of Physics and Astronomy and Raymond and Beverly Sackler School of Chemistry. The group includes Maayan Vizner Stern, Yuval Waschitz, Dr. Wei Cao, Dr. Iftach Nevo, Prof. Eran Sela, Prof. Michael Urbakh, Prof. Oded Hod, and Dr. Moshe Ben Shalom. The work is now published in Science magazine.
    “Our research stems from curiosity about the behavior of atoms and electrons in solid materials, which has generated many of the technologies supporting our modern way of life,” says Dr. Ben Shalom. “We (and many other scientists) try to understand, predict, and even control the fascinating properties of these particles as they condense into an ordered structure that we call a crystal. At the heart of the computer, for example, lies a tiny crystalline device designed to switch between two states indicating different responses — ‘yes’ or ‘no,’ ‘up’ or ‘down,’ etc. Without this dichotomy, it is not possible to encode and process information. The practical challenge is to find a mechanism that would enable switching in a small, fast, and inexpensive device.”
    Current state-of-the-art devices consist of tiny crystals that contain only about a million atoms (about a hundred atoms in height, width, and thickness); about a million of these devices can be squeezed into the area of one coin, with each device switching at a speed of about a million times per second.
    Following the technological breakthrough, the researchers were able, for the first time, to reduce the thickness of the crystalline devices to two atoms only. Dr. Ben Shalom emphasizes that such a thin structure enables memories based on the quantum ability of electrons to hop quickly and efficiently through barriers that are just several atoms thick. Thus, it may significantly improve electronic devices in terms of speed, density, and energy consumption.
    In the study, the researchers used a two-dimensional material: one-atom-thick layers of boron and nitrogen, arranged in a repetitive hexagonal structure. In their experiment, they were able to break the symmetry of this crystal by artificially assembling two such layers. “In its natural three-dimensional state, this material is made up of a large number of layers placed on top of each other, with each layer rotated 180 degrees relative to its neighbors (antiparallel configuration)” says Dr. Ben Shalom. “In the lab, we were able to artificially stack the layers in a parallel configuration with no rotation, which hypothetically places atoms of the same kind in perfect overlap despite the strong repulsive force between them (resulting from their identical charges). In actual fact, however, the crystal prefers to slide one layer slightly in relation to the other, so that only half of each layer’s atoms are in perfect overlap, and those that do overlap are of opposite charges — while all others are located above or below an empty space — the center of the hexagon. In this artificial stacking configuration the layers are quite distinct from one another. For example, if in the top layer only the boron atoms overlap, in the bottom layer it’s the other way around.”
    Dr. Ben Shalom also highlights the work of the theory team, who conducted numerous computer simulations. “Together we established a deep understanding of why the system’s electrons arrange themselves just as we had measured in the lab. Thanks to this fundamental understanding, we expect fascinating responses in other symmetry-broken layered systems as well,” he says.
    Maayan Vizner Stern, the PhD student who led the study, explains: “The symmetry breaking we created in the laboratory, which does not exist in the natural crystal, forces the electric charge to reorganize itself between the layers and generate a tiny internal electrical polarization perpendicular to the layer plane. When we apply an external electric field in the opposite direction the system slides laterally to switch the polarization orientation. The switched polarization remains stable even when the external field is shut down. In this respect the system is similar to thick three-dimensional ferroelectric systems, which are widely used in technology today.”
    “The ability to force a crystalline and electronic arrangement in such a thin system, with unique polarization and inversion properties resulting from the weak Van der Waals forces between the layers, is not limited to the boron and nitrogen crystal,” adds Dr. Ben Shalom. “We expect the same behaviors in many layered crystals with the right symmetry properties. The concept of interlayer sliding as an original and efficient way to control advanced electronic devices is very promising, and we have named it Slide-Tronics.”
    Maayan Vizner Stern concludes: “We are excited about discovering what can happen in other states we force upon nature and predict that other structures that couple additional degrees of freedom are possible. We hope that miniaturization and flipping through sliding will improve today’s electronic devices, and moreover, allow other original ways of controlling information in future devices. In addition to computer devices, we expect that this technology will contribute to detectors, energy storage and conversion, interaction with light, etc. Our challenge, as we see it, is to discover more crystals with new and slippery degrees of freedom.”
    The study was funded through support from the European Research Council (ERC starting grant), the Israel Science Foundation (ISF), and the Ministry of Science and Technology (MOST).

  • Novel heat-management material keeps computers running cool

    UCLA engineers have demonstrated successful integration of a novel semiconductor material into high-power computer chips to reduce heat on processors and improve their performance. The advance greatly increases energy efficiency in computers and enables heat removal beyond the best thermal-management devices currently available.
    The research was led by Yongjie Hu, an associate professor of mechanical and aerospace engineering at the UCLA Samueli School of Engineering, and the findings were recently published in Nature Electronics.
    Computer processors have shrunk to nanometer scales over the years, with billions of transistors sitting on a single computer chip. While the increased number of transistors helps make computers faster and more powerful, it also generates more hot spots in a highly condensed space. Without an efficient way to dissipate heat during operation, computer processors slow down, resulting in unreliable and inefficient computing. In addition, the highly concentrated heat and soaring temperatures on computer chips require extra energy to prevent processors from overheating.
    To solve the problem, Hu and his team pioneered the development of a new ultrahigh thermal-management material in 2018. The researchers developed defect-free boron arsenide in their lab and found it to be much more effective in drawing and dissipating heat than other known metal or semiconductor materials such as diamond and silicon carbide. Now, for the first time, the team has successfully demonstrated the material’s effectiveness by integrating it into high-power devices.
    In their experiments, the researchers used computer chips with state-of-the-art, wide-bandgap transistors made of gallium nitride, called high-electron-mobility transistors (HEMTs). When the processors ran at near maximum capacity, chips that used boron arsenide as a heat spreader rose from room temperature to a maximum of nearly 188 degrees Fahrenheit. This is significantly lower than chips using diamond to spread heat, which rose to approximately 278 degrees Fahrenheit, or those using silicon carbide, which rose to about 332 degrees Fahrenheit.
    “These results clearly show that boron-arsenide devices can sustain much higher operation power than processors using traditional thermal-management materials,” Hu said. “And our experiments were done under conditions where most current technologies would fail. This development represents a new benchmark performance and shows great potential for applications in high-power electronics and future electronics packaging.”
    According to Hu, boron arsenide is ideal for heat management because it not only exhibits excellent thermal conductivity but also displays low heat-transport resistance.
    “When heat crosses a boundary from one material to another, there’s typically some slowdown to get into the next material,” Hu said. “The key feature in our boron arsenide material is its very low thermal-boundary resistance. This is sort of like if the heat just needs to step over a curb, versus jumping a hurdle.”
    The team has also developed boron phosphide as another excellent heat-spreader candidate. During their experiments, the researchers first illustrated the way to build a semiconductor structure using boron arsenide and then integrated the material into a HEMT-chip design. The successful demonstration opens up a path for industry adoption of the technology.
    Story Source:
    Materials provided by University of California – Los Angeles.

  • Bronze Age: how the market began

    Knowing the weight of a commodity provides an objective way to value goods in the marketplace. But did a self-regulating market even exist in the Bronze Age? And what can weight systems tell us about this? A team of researchers from the University of Göttingen researched this by investigating the dissemination of weight systems throughout Western Eurasia. Their new simulation indicates that the interaction of merchants, even without substantial intervention from governments or institutions, is likely to explain the spread of Bronze Age technology to weigh goods. The results were published in Proceedings of the National Academy of Sciences (PNAS).
    To determine how different units of weight emerged in different regions, the researchers compared all the weight systems in use between Western Europe and the Indus Valley from 3000 to 1000 BC. Analysis of 2,274 balance weights from 127 sites revealed that, with the exception of those from the Indus Valley, new and very similar units of weight appeared in a gradual spread westward from Mesopotamia. To find out whether the gradual formation of these systems could be due to the propagation of error from a single weight system, the researchers modelled the creation of 100 new units. Taking into account factors such as measurement error, the simulation supported a single origin between Mesopotamia and Europe. It also showed that the Indus Valley probably developed an independent weight system. The research demonstrated that if information flow in Eurasian trade was free enough to support a common weight system, it was likely also sufficient to react to local price fluctuations.
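The modelling idea can be illustrated with a toy Monte Carlo. This is a sketch, not the published model: the 8.4 g "origin" unit and the 5 per cent copying error are assumptions chosen for illustration.

```python
import random

random.seed(1)
ORIGIN = 8.4        # grams, loosely based on a Mesopotamian shekel (assumed)
ERROR_SD = 0.05     # 5% error each time a merchant copies an existing weight (assumed)
N_UNITS = 100       # new units created, echoing the paper's 100-unit simulation

# Each new weight is copied, imperfectly, from one already in circulation.
units = [ORIGIN]
for _ in range(N_UNITS - 1):
    parent = random.choice(units)
    units.append(parent * random.gauss(1.0, ERROR_SD))

mean = sum(units) / len(units)
sd = (sum((u - mean) ** 2 for u in units) / len(units)) ** 0.5
print(f"mean unit: {mean:.2f} g, relative spread: {sd / mean:.1%}")
```

Even after a hundred copy-of-a-copy steps the units cluster around the original value, consistent with the idea that similar weight systems can emerge from error propagation alone, without any central authority.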
    The weight systems that emerged between Mesopotamia and Europe were very similar. This meant that a single merchant could travel, for instance, from Mesopotamia to the Aegean and from there to Central Europe and never need to change their own set of weights. The merchant could trade with foreign partners while simply relying on approximating the weights. There was no international authority that could have regulated the accuracy of weight systems over such a wide territory and long time span. In Europe, beyond the Aegean, centralised authorities did not even exist at this time. The researchers conclude that the emergence of accurate weight systems must have been the outcome of a global network regulating itself from the bottom-up.
    “With the results of our statistical analysis and experimental tests, it is now possible to prove the long-held hypothesis that free entrepreneurship was already a primary driver of the world economy even as early as the Bronze Age,” explains Professor Lorenz Rahmstorf from the Institute for Prehistory and Early History, University of Göttingen. Merchants could interact freely, establish profitable partnerships, and take advantage of the opportunities offered by long-distance trade. “The idea of a self-regulating market existing some 4,000 years ago puts a new perspective on the global economy of the modern era,” says Dr Nicola Ialongo, University of Göttingen. He adds, “Try to imagine all the international institutions that currently regulate our modern world economy: is global trade possible thanks to these institutions, or in spite of them?”
    Story Source:
    Materials provided by University of Göttingen.

  • Fungi embrace fundamental economic theory as they engage in trading

    When you think about trade and market relationships, you might think about brokers yelling at each other on the floor of a stock exchange on Wall Street. But it seems one of the basic functions of a free market is quietly practiced by fungi.
    New research from a Rice University economist suggests certain networks of fungi embrace an important economic theory as they engage in trading nutrients for carbon with their host plants. This finding could aid the understanding of carbon storage in soils, an important tool in mitigating climate change.
    A research paper entitled “Walrasian equilibrium behavior in nature” is available online and will appear in an upcoming edition of Proceedings of the National Academy of Sciences. Ted Loch-Temzelides, a professor of economics and the George and Cynthia Mitchell Chair in Sustainable Development at Rice, examined through an economic lens data from ecological experiments on arbuscular mycorrhizal fungi networks, which connect to plants and facilitate the trading of nutrients for carbon.
    Loch-Temzelides found that these relationships resemble how economists think about competitive — also known as Walrasian — markets. The paper demonstrates that Walrasian equilibrium, a leading concept in the economic theory of markets used to make predictions, can also be used to understand trade in this “biological market.”
    “Far from being self-sacrificing, organisms such as fungi can exhibit competitive behavior similar to that in markets involving sophisticated human participants,” Loch-Temzelides said.
    His finding also implies that resources are allocated to the maximum benefit of the market participants — in this case, fungi and plants.
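For readers outside economics, a Walrasian equilibrium is simply the price at which supply equals demand, so no participant can gain by trading at a different price. A minimal sketch with hypothetical linear curves (the parameters are illustrative, not taken from the paper):

```python
# Demand falls with price, supply rises with it (illustrative linear curves).
a, b = 100.0, 2.0      # demand: q_d = a - b * p
c, d = 10.0, 1.0       # supply: q_s = c + d * p

# Market clearing: a - b*p = c + d*p  =>  p* = (a - c) / (b + d)
p_star = (a - c) / (b + d)
q_star = a - b * p_star
print(f"equilibrium price {p_star:.1f}, quantity traded {q_star:.1f}")  # 30.0, 40.0
```

In the fungal analogue, the "price" is the exchange rate of carbon for nutrients, and the paper's claim is that observed trades settle near this market-clearing point.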
    “Mycorrhizal fungi networks around the world are estimated to sequester around 5 billion tons of carbon per year,” Loch-Temzelides said. “Manipulating the terms of trade so that carbon obtained from host plants becomes less expensive compared to nutrients could lead to additional carbon being stored in the soil, which could provide major benefits in fighting climate change.”
    Loch-Temzelides hopes future research by biologists and economists can make progress on better understanding these interactions.
    Story Source:
    Materials provided by Rice University.

  • 3 things to know about the record-smashing heat wave baking the Pacific Northwest

    Like a lid on a steaming pot, a high-pressure system is sitting over the U.S. Pacific Northwest and British Columbia, Canada, sending temperatures in the region soaring to unprecedented heights.

    From a historical perspective, the event is so rare and extreme as to be a once-in-a-millennium heat wave. But one consequence of Earth’s rapidly changing climate is that such extreme events will become much more common in the region in the future, says Larry O’Neill, a climate scientist at Oregon State University in Corvallis.

    Temperatures in Portland, Ore., reached 115° Fahrenheit (46° Celsius) on June 29, the highest temperature recorded there since record-keeping began in 1940; average high temperatures for this time of year are about 73° F (23° C). Similar records were notched across the region and more are expected to be set as the high pressure system slowly slides east.

    The heat was so extreme it melted power cables for Portland’s streetcars and caused asphalt and concrete roads in western Washington to expand and crack. Such high temperatures are particularly dangerous in a normally cool region little accustomed to or prepared for them, raising the risk of heat-related deaths and other health hazards (SN: 4/3/18). Ground-level ozone, for instance, also reached its highest levels yet in 2021, the chemical reactions that form the gas amped up by a potent mix of high heat and strong ultraviolet light.

    O’Neill talked to Science News about three things to know about the heat wave.

    1. The heat wave is linked to a stalled kink in the jet stream.

    Jet streams, fast-moving currents of air high in the troposphere, encircle both poles, helping to push weather systems around Earth’s surface. These currents aren’t smooth and straight; they can meander and form large swirls, with peaks and troughs surrounding zones of high and low pressure.

    Occasionally, these weather patterns stall, becoming stationary “blocking events” that keep a particular spate of weather in place for an extended period of time. One such stalled-out high-pressure zone — basically a large dome of hot, dry weather — is now sitting atop the Pacific Northwest.

    “The punishing heatwave has an incredible jet stream pattern. The dome of heat will be encircled by the polar jet and this helps lift a sub-tropical jet branch almost into the Canadian Arctic,” London-based meteorologist Scott Duncan (@ScottDuncanWX) tweeted on June 25, 2021. In the jet stream imagery he shared, hot, dry air swirls around and maintains a high-pressure system over the region from June 24 to June 29, locking that hot, dry air in place.

    Historically, similar high-pressure patterns have brought heat waves to the region, O’Neill says. But this one is different. A typical severe heat wave in the past might lead to temperatures of about 100 °F, he says, “not 115 °F.”

    2. Climate change is making the heat wave more severe.

    Baseline temperatures were already higher than in the past, due to Earth’s changing climate. Globally, Earth’s average temperatures are increasing, with 2016 and 2020 tied for the hottest years on record (SN: 1/14/21).

    Those changes are reflected in what’s now officially considered “normal.” In May, for example, the U.S. National Oceanic and Atmospheric Administration reported that the country’s new baseline reference temperature, or “climate normal,” will be the period from 1991 to 2020 — also now the hottest 30-year period on record for the country (SN: 5/26/21).

    That changing reference makes it tough to place such an unprecedented heat wave in any kind of historical context. “We have a historical data record that’s 100 years long,” O’Neill says. Saying that the heat wave is a once-in-a-millennium event means that “you would expect that, at random chance, this would occur once every 1,000 years. But we’ve never observed this. We have no basis to say this,” he adds. “This is a climate that we’re not accustomed to.”
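O’Neill’s point about the 100-year record can be made concrete with a quick probability check: if an event truly has a 1-in-1,000 chance of occurring in any given year, the odds of ever having seen one in a century of observations are small.

```python
p_annual = 1 / 1000              # annual probability of a "once-in-a-millennium" event
years = 100                      # length of the historical record
p_seen = 1 - (1 - p_annual) ** years
print(f"chance of at least one occurrence in {years} years: {p_seen:.1%}")  # 9.5%
```

So even if the once-in-a-millennium label were exactly right, a 100-year record would most likely contain no such event, which is why the historical data offer little basis for the estimate.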

    3. Climate change is likely to make such extreme events more common in the future.

    A week before the onset of the heat wave, forecasters were predicting such unprecedented temperatures for the region that many people dismissed those predictions as “being ridiculous,” O’Neill says. “Turns out, [the forecasters] were right.”

    Future climate change attribution studies may shed some more light on the ways in which this particular heat wave may be linked to climate change (SN: 7/15/20). Overall, it’s known that climate change is likely to make such extreme events more common in the future, O’Neill says. “We’re seeing these highs form more frequently, and more persistently.” Extreme heat and extreme drought in the U.S. West, for example, can create a reinforcing cycle that exacerbates both (SN: 4/16/20).  

    And that poses many dangers for the planet, not least for human health (SN: 4/3/18). In May, scientists reported in Nature Climate Change that 37 percent of heat-related deaths between 1991 and 2018 were attributable to human-caused climate change.  

    “When we talk about climate change, often the conversation is a little more abstract,” O’Neill says. “We’re experiencing it right now (SN: 11/25/19). And this question about whether we adapt and mitigate — that’s something we have to figure out right now.”