More stories

  • Impact of online communities

    The Governance Lab (The GovLab) at the NYU Tandon School of Engineering released a report, “The Power of Virtual Communities,” which examines the role online groups play in creating opportunities for people to build new kinds of meaningful communities they often could not form in real space.
    This first-of-its-kind research was built on interviews with 50 Facebook community leaders in 17 countries, interviews with 26 global experts from academia and industry, unique access to Facebook’s underlying research, and an original global survey conducted by YouGov of 15,000 people in 15 countries who are currently members of online and in-person communities. In 11 of those countries, a majority of respondents said that the most meaningful communities to which they belong are primarily online.
    “Around the world, people who are otherwise voiceless in physical space are becoming powerful leaders of groups that confer a true sense of meaning and belonging for their members,” said Beth Simone Noveck, director of The GovLab. “This brief report, which tells the stories of several of those leaders and how they govern global communities is, we hope, the beginning of greater and much needed study of online groups and their impact on social and political life.”
    Many of these Facebook groups cut across traditional social groupings and bring together people around a shared trait or interest:
    Female IN (FIN) was created as a safe space for women in the Nigerian diaspora to discuss and seek support for challenges such as relationship struggles, health issues, abuse, grief and loss. It grew by word of mouth into a 1.8 million-member community spanning more than 100 countries.
    Surviving Hijab encourages its 920,000 female members to take up or continue wearing the Muslim head covering in the face of political and social criticism.
    Blind PenPals enables its 7,000 blind and visually impaired members to share stories and advice.
    Canterbury Residents Group acts as a public square in the British city of Canterbury and has 38,000 members, about the same size as the city’s population.
    Subtle Asian Traits, which began as a modest initiative among nine young Australians of Chinese background to share funny memes about their Asian heritage, has expanded to a group of 1.82 million people who discuss and share the experience of growing up Asian in mostly majority-White societies.
    The GovLab’s report findings note that:
    Membership in online communities confers a strong sense of community, the lack of physical proximity notwithstanding.
    Online groups are a still fluid form of human organization that in many cases attract members and leaders who are marginalized in the physical societies they inhabit, and who use the platform to build new kinds of communities that would be difficult to form otherwise.
    Many of these groups have counter-cultural norms and are what political scientists might call “cross-cleavage” communities. These groups cut across traditional social groupings, and bring together people normally divided by geography around a shared trait or interest.
    The flexible affordances of online platforms have enabled new kinds of leaders to emerge in these groups with unique skills in moderating often divisive dialogues, sometimes among millions of members.
    Most groups are run as a labor of love; many leaders are neither trained nor paid, the rules that govern their internal operations are often uncodified, and the hosting platform — in this case Facebook — holds significant power over their operations and future.
    These groups, some of which have huge memberships, remain emergent and largely unrecognized: they are outside traditional power structures, institutions and forms of governance.
    More research is needed to understand whether and how these groups will operate as genuine communities over the long term, especially given the tensions that derive from conducting public life on a private platform such as Facebook, and how such groups and their leaders can be supported to ensure they provide maximum voice, participation and benefit to their members.
    Further, results from the YouGov survey and the interviews with group leaders indicated that the three most essential traits and behaviors for leaders to exhibit were welcoming differences of opinions, being visible and communicating well, and acting ethically at all times.
    The report, published in six languages, also shines a light on the role these leaders play and why it is important to support them further in running their communities.

    Story Source:
    Materials provided by NYU Tandon School of Engineering. Note: Content may be edited for style and length.

  • An intelligent soft material that curls under pressure or expands when stretched

    Plants and animals can rapidly respond to changes in their environment, such as a Venus flytrap snapping shut when a fly touches it. However, replicating similar actions in soft robots requires complex mechanics and sensors. Now, researchers reporting in ACS Applied Materials & Interfaces have printed liquid metal circuits onto a single piece of soft polymer, creating an intelligent material that curls under pressure or mechanical strain.
    Ideally, soft robots could mimic intelligent and autonomous behaviors in nature, combining sensing and controlled movement. But the integration of sensors and the moving parts that respond can be clunky or require an external computer. A single-unit design is needed that responds to environmental stimuli, such as mechanical pressure or stretching.
    Liquid metals could be the solution, and some researchers have already investigated their use in soft robots. These materials can be used to create thin, flexible circuits in soft materials, and the circuits rapidly produce heat when an electric current flows through them, whether from an electrical source or from pressure applied to the circuit. When the soft circuits are stretched, the current drops, cooling the material.
    To make a soft robot capable of autonomous, intelligent movement, Chao Zhao, Hong Liu and colleagues wanted to integrate liquid metal circuits with liquid crystal elastomers (LCE) — polymers that can undergo large changes to their shape when heated or cooled.
    The researchers applied a nickel-infused gallium-indium alloy onto an LCE and magnetically moved the liquid metal into lines to form an uninterrupted circuit. A silicone sealant that changed from pink to dark red when warmed kept the circuit protected and in place. In response to a current, the soft material curled as the temperature increased, and the film turned redder over time.
    The team used the material to develop autonomous grippers that perceived and responded to pressure or stretching applied to the circuits. The grippers could pick up small round objects and then drop them when the pressure was released or the material was stretched.
    Finally, the researchers formed the film into a spiral shape. When pressure was applied to the circuit at the bottom of the spiral, it unfurled with a rotating motion as the spiral’s temperature increased. The researchers say these pressure- and stretch-sensitive materials could be adapted for use in soft robots performing complex tasks or locomotion.
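    The behavior described above is essentially a feedback chain: applied pressure drives a current through the liquid-metal circuit, the current heats the film, the heated LCE curls, and stretching reduces the current so the film cools and relaxes. As a purely qualitative illustration of that chain, the short script below integrates a first-order thermal model with invented parameters (thermal time constant, heating rate, curvature gain); it is a hedged sketch, not a model of the device reported in ACS Applied Materials & Interfaces.

    ```python
    # Qualitative sketch of the pressure -> current -> heat -> curl chain.
    # All parameters are illustrative assumptions, not values from the paper.
    import numpy as np

    TAU = 5.0          # assumed thermal time constant (s)
    K_HEAT = 10.0      # assumed heating rate per unit squared current (K/s)
    CURL_GAIN = 0.01   # assumed curvature per kelvin above ambient (1/mm per K)
    T_AMBIENT = 25.0   # ambient temperature (C)

    def simulate(current_profile, dt=0.1):
        """Integrate a simple thermal model and map temperature to curvature."""
        temp, history = T_AMBIENT, []
        for drive in current_profile:
            # Joule heating grows with the square of the current;
            # cooling relaxes the film back toward ambient temperature.
            temp += (K_HEAT * drive**2 - (temp - T_AMBIENT) / TAU) * dt
            history.append((temp, CURL_GAIN * (temp - T_AMBIENT)))
        return history

    # "Pressure applied" raises the drive current for 30 s, then "stretching"
    # drops it, letting the film cool and uncurl.
    profile = np.concatenate([np.full(300, 1.0), np.full(300, 0.2)])
    history = simulate(profile)
    for step in (0, 299, 599):
        temp, curv = history[step]
        print(f"t={step * 0.1:5.1f} s  T={temp:6.1f} C  curvature={curv:.3f} /mm")
    ```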

    Story Source:
    Materials provided by American Chemical Society. Note: Content may be edited for style and length.

  • Quantum systems learn joint computing

    Researchers realize the first quantum-logic computer operation between two separate quantum modules in different laboratories.
    Today’s quantum computers contain up to several dozen memory and processing units, the so-called qubits. Severin Daiss, Stefan Langenfeld, and colleagues from the Max Planck Institute of Quantum Optics in Garching have successfully connected two such qubits, located in different labs, into a distributed quantum computer by linking them with a 60-meter-long optical fiber. Over this distance they realized a quantum-logic gate — the basic building block of a quantum computer. This makes the system the world’s first prototype of a distributed quantum computer.
    The limitations of previous qubit architectures
    Quantum computers are considerably different from traditional “binary” computers: future realizations are expected to easily perform specific calculations for which traditional computers would take months or even years — for example, in the field of data encryption and decryption. While the performance of binary computers results from large memories and fast computing cycles, the success of the quantum computer rests on the fact that a single memory unit — a quantum bit, also called a “qubit” — can contain superpositions of different possible values at the same time. A quantum computer therefore does not calculate just one result at a time, but many possible results in parallel. The more qubits that are interconnected in a quantum computer, the more complex the calculations it can perform.
    The basic computing operations of a quantum computer are quantum-logic gates between two qubits. Such an operation changes — depending on the initial state of the qubits — their quantum mechanical states. For a quantum computer to be superior to a normal computer for various calculations, it would have to reliably interconnect many dozens, or even thousands, of qubits for equally many thousands of quantum operations. Despite great successes, laboratories everywhere are still struggling to build such a large and reliable quantum computer, since every additionally required qubit makes it much harder to build the machine in a single setup. The qubits are implemented, for instance, with single atoms, superconducting elements, or light particles, all of which need to be isolated perfectly from each other and from the environment. The more qubits are arranged next to one another, the harder it is to isolate them and control them from outside at the same time.
    Data line and processing unit combined
    One way to overcome the technical difficulties of constructing quantum computers is presented in a new study in the journal Science by Severin Daiss, Stefan Langenfeld and colleagues from the research group of Gerhard Rempe at the Max Planck Institute of Quantum Optics in Garching. In this work, supported by the Institute of Photonic Sciences (Castelldefels, Spain), the team succeeded in connecting two qubit modules across a 60-meter distance in such a way that they effectively form a basic quantum computer with two qubits. “Across this distance, we perform a quantum computing operation between two independent qubit setups in different laboratories,” Daiss emphasizes. This opens the possibility of merging smaller quantum computers into a joint processing unit.
    Simply coupling distant qubits to generate entanglement between them has been achieved in the past, but now the connection can additionally be used for quantum computations. For this purpose, the researchers employed modules each consisting of a single atom, serving as a qubit, positioned between two mirrors. Between these modules, they send a single light quantum, a photon, through the optical fiber. The photon becomes entangled with the quantum states of the qubits in the two modules. Subsequently, the state of one of the qubits is changed according to the measured state of this “ancilla photon,” realizing a quantum mechanical CNOT operation with a fidelity of 80 percent. A next step would be to connect more than two modules and to host more qubits in the individual modules.
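    As a minimal mathematical illustration of what a CNOT gate between two qubits does (not of the photon-mediated protocol itself, which relies on an ancilla photon and a measurement), the sketch below applies the CNOT matrix to a two-qubit state vector and shows how a superposition on the control qubit turns into entanglement between the two qubits.

    ```python
    # Textbook state-vector illustration of a two-qubit CNOT gate, the basic
    # operation realized here between two distant modules. This is plain
    # linear algebra, not a simulation of the fiber-linked experiment.
    import numpy as np

    zero = np.array([1.0, 0.0])                   # |0>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

    # CNOT in the |control, target> basis, ordered |00>, |01>, |10>, |11>.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=float)

    # Put the control qubit into the superposition (|0> + |1>)/sqrt(2),
    # leave the target in |0>, then apply the CNOT.
    state_in = np.kron(H @ zero, zero)
    state_out = CNOT @ state_in

    print("input amplitudes :", np.round(state_in, 3))   # 0.707 |00> + 0.707 |10>
    print("output amplitudes:", np.round(state_out, 3))  # 0.707 |00> + 0.707 |11>, entangled
    ```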
    Higher performance quantum computers through distributed computing
    Team leader and institute director Gerhard Rempe believes the result will allow the technology to be advanced further: “Our scheme opens up a new development path for distributed quantum computing.” It could, for instance, enable the construction of a distributed quantum computer consisting of many modules with few qubits each, interconnected with the newly introduced method. This approach could circumvent the limits existing quantum computers face in integrating more qubits into a single setup and could therefore allow for more powerful systems.

    Story Source:
    Materials provided by Max-Planck-Gesellschaft. Note: Content may be edited for style and length.

  • Most important global supply chain linkages

    In today’s global economy, production of goods depends on inputs from many trade partners around the world. Companies and governments need a deeper understanding of the global value chain to reduce costs, maintain a profitable production system, and anticipate ripple effects of disruptions in the supply chain.
    Applied economists from the University of Illinois have developed a new model for in-depth analysis of global supply chain linkages across countries and industries, providing a rich tool that delivers valuable insights for businesses and policy makers around the world.
    “We live in a time when production processes are very much fragmented. In order to end up with one type of good, a car for example, many inputs are assembled abroad and imported from different places around the world. For instance, a car sold by leading U.S. companies may have anywhere from just 2% to 85% of U.S. and Canadian parts in it,” says Sandy Dall’Erba, professor in the Department of Agricultural and Consumer Economics and director of the Regional Economics Applications Laboratory (REAL) at U of I. Dall’Erba is co-author of the study.
    “Coordination of the entire supply chain system becomes more and more complicated and sensitive to disruptions at any stage throughout the process. If just one element in your supply chain is missing, it will have a ripple effect on the entire industry,” Dall’Erba notes. “An example of this was the global semiconductor shortage that recently forced U.S. automakers to halt production.”
    The researchers started with a widely used economic growth model called shift-share decomposition and expanded its components to include interregional and inter-sectoral linkages. This allows them to identify, for each industrial sector and each country, whether the growth of the sector of interest is due to supply chain linkages at the domestic level or at the international level. The latter can be further split between linkages with trade agreement partners (such as NAFTA for the U.S.) and countries in the rest of the world, highlighting the benefits of trade agreements.
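    For readers unfamiliar with the starting point, textbook shift-share decomposition splits a region-sector’s growth into a national-growth component, an industry-mix component, and a region-specific shift. The sketch below implements only that baseline with invented numbers; the interregional and inter-sectoral linkage terms the new model adds are not reproduced here.

    ```python
    # Textbook shift-share decomposition of a region-sector's growth into
    # national-growth, industry-mix, and regional-shift components.
    # Numbers are invented; the study extends this baseline with
    # interregional and inter-sectoral linkage terms not shown here.

    def shift_share(e_start, e_end, nat_sector_growth, nat_total_growth):
        """Decompose the change in a region-sector's output (or employment)."""
        national = e_start * nat_total_growth
        industry_mix = e_start * (nat_sector_growth - nat_total_growth)
        regional_growth = (e_end - e_start) / e_start
        regional_shift = e_start * (regional_growth - nat_sector_growth)
        return national, industry_mix, regional_shift

    # Hypothetical food-manufacturing sector in one region: output grows from
    # 100 to 112 while the sector grows 8% nationally and the whole national
    # economy grows 5%.
    ns, im, rs = shift_share(e_start=100.0, e_end=112.0,
                             nat_sector_growth=0.08, nat_total_growth=0.05)
    print(f"national growth component: {ns:.1f}")
    print(f"industry-mix component   : {im:.1f}")
    print(f"regional-shift component : {rs:.1f}")
    print(f"sum (= total change)     : {ns + im + rs:.1f}")
    ```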
    “When we apply our technique to understand the drivers of growth in a particular sector, we not only can say whether it is growing faster or slower than another sector or region, we can also identify other sectors that are important for the growth of this particular sector,” says Claudia Montania, the study’s lead author. Montania was a visiting scholar in REAL when she conducted the study and is currently a researcher at the United Nations Development Accelerator Lab in Asuncion, Paraguay.

    Traditional shift-share decomposition includes information about changes in the industry mix and in region-specific features such as taxes, regulations, or characteristics of the labor force. But it does not include connections among different regions or different industry sectors.
    “The information provided by the traditional shift-share model is not enough,” Dall’Erba notes. “For example, it would be a mistake to study only the food manufacturing sector in order to know what is happening in that sector, because it obviously depends on grain and livestock production which, in turn, depends on water and fertilizers among other inputs.
    “In addition, grains are not always used for food manufacturing but they may end up as fuel. The supply chain of any sector is intertwined with that of many other sectors,” he adds.
    In the paper, Dall’Erba and Montania apply their model to country-sector linkages in the European Union, allowing them to compare three levels of connections (domestic, within the EU, and with the rest of the world) and to identify which ones matter most for each sector. The analysis included 35 industrial sectors in 15 countries from 1995 to 2006.
    Overall, the researchers found the most important linkages were among EU trade partners; the second-most important were domestic ties; and the least important linkages were with the rest of the world. They emphasize the results vary across sectors and countries. For example, the supply-chain linkages in place to manufacture a French car are different from those that exist for a German car. Their multi-dynamic model can provide detailed, specific information for each country-sector combination as needed for preemptive and tailored planning and policy making.
    “Knowing which type of linkages are the most important for your product or your sector can be very useful for local governments, for companies, and for producers, because you can make better plans to achieve the expected growth for your sector,” Montania states. “You can also promote trade and diplomatic relationships in regions where you have strong sectoral linkages.”
    Dall’Erba points out this information can help countries and industries protect against supply chain disruptions, which can take many forms, ranging from natural disasters such as droughts or earthquakes to political upheaval, trade wars, and even the global pandemic. For instance, the extreme disruption airlines experienced as demand for air travel dropped in 2020 means both Boeing and Airbus have significantly reduced their production, and so have the many companies manufacturing airplane components, from fuselages to seat belts.
    “COVID-19 has pushed several governments to consider bringing back some industries in order to get better control over all the supply chain links. However, it is not necessarily a viable option, as many companies have already relocated their unskilled, labor-intensive production to low-wage countries while maintaining high-skilled workers at home,” Dall’Erba concludes.

  • Game theory may be useful in explaining and combating viruses

    A team of researchers concludes that a game-theory approach may offer new insights into both the spread and disruption of viruses, such as SARS-CoV-2. The work, described in the journal Royal Society Interface, applies a “signaling game” to the analysis of cellular processes, illuminating molecular behavior.
    “We need new models and technologies at many levels in order to understand how to tame viral pandemics,” explains Bud Mishra, a professor at NYU’s Courant Institute of Mathematical Sciences and one of the paper’s authors. “At the biomolecular level, we explain how cellularization may be understood in ways that stymie disease and encourage healthy functioning.”
    The analysis, which also included William Casey, an assistant professor in the U.S. Naval Academy’s Cyber Science Department, and Steven Massey, an assistant professor in the Department of Biology at the University of Puerto Rico, centered on the biological and evolutionary phenomenon of “mimicry” — one organism changing form to resemble another.
    The researchers, in particular, focused on two types of mimicry: “Batesian” and “Muellerian.” Batesian mimicry, named after the naturalist Henry Walter Bates, involves conflict or deception between the sender and receiver — for example, a harmless hoverfly mimics a more dangerous wasp in order to deter predators. By contrast, Muellerian mimicry, named after the zoologist and naturalist Johann Friedrich Theodor Mueller, occurs when there is a common interest between the sender and receiver — for instance, two species that adopt each other’s warning signals as a means to offer protection for both.
    These types of mimicry also occur at the molecular level.
    “The gene for an RNA or a protein macro-molecule can be considered as the sender, while the signal consists of the three-dimensional conformation of the expressed gene product,” write the authors. “The receiver is the macro-molecule, which specifically interacts with the signal macro-molecule, typically a protein, but could also be an RNA or DNA molecule.”
    The SARS-CoV-2 virus, they add, makes multiple uses of molecular mimicry in its efforts to exploit its human host, mimicking healthy cells in Batesian fashion in order to infect the host organism. Vaccines, by contrast, deceive the human immune system into sensing that it is being attacked by a virus. While this deception is costly to the vaccinated subject in the short term — in the form of reactions to the injection — the immune system retains a memory and so is prepared in advance for a future encounter with the real virus.
    This dynamic plays out annually in the creation of flu shots — vaccines are altered each year in order to accurately mimic a newly evolved flu virus.
    With this in mind, the researchers sought to determine if a signaling game could provide a framework for analyzing the different types of mimicry. Under a signaling game, a sender aims to persuade the receiver that it carries a message that benefits both — independent of the veracity of the claim.
    In their analysis, the paper’s authors constructed a mathematical model that mapped out a series of signaling strategies that, theoretically, could be adopted by both a virus (Batesian mimicry) and a vaccine (Muellerian mimicry). Their results offered a range of blueprints for how mimicry is formed, maintained, and destroyed in cellular populations.
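    To make the signaling-game framing concrete, the toy model below sets up a minimal sender-receiver game in which a warning signal may come from a genuinely dangerous sender or from a harmless Batesian mimic, and computes the receiver’s best response as the share of dangerous senders varies. The payoff numbers are invented for illustration only and are not those of the model in the paper.

    ```python
    # Toy sender-receiver signaling game: the receiver sees a warning signal
    # that may come from a dangerous sender or a harmless Batesian mimic and
    # chooses to "avoid" or "engage". Payoffs are illustrative assumptions.

    def receiver_best_response(p_dangerous,
                               u_engage_dangerous=-10.0,  # cost of engaging a real threat
                               u_engage_harmless=2.0,     # payoff for engaging a mimic
                               u_avoid=0.0):              # avoiding yields nothing
        """Best action given the belief that the signal is honest with
        probability p_dangerous."""
        expected_engage = (p_dangerous * u_engage_dangerous
                           + (1 - p_dangerous) * u_engage_harmless)
        return ("avoid" if u_avoid >= expected_engage else "engage"), expected_engage

    # As mimics become common (p_dangerous falls), the deceptive signal stops
    # working: the receiver switches from avoiding to engaging, which is why
    # Batesian mimicry is only stable while mimics remain relatively rare.
    for p in (0.9, 0.5, 0.2, 0.05):
        action, value = receiver_best_response(p)
        print(f"P(dangerous)={p:.2f}  E[engage]={value:+.2f}  best response: {action}")
    ```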
    “Better knowledge of the deceptive strategies of SARS-CoV-2 will help to inform vaccine design,” the researchers conclude.
    The research was supported by the Office of Naval Research (N0001420WX01716), a National Cancer Institute Physical Sciences-Oncology Center grant (U54 CA193313-01), and a U.S. Army grant (W911NF1810427).

    Story Source:
    Materials provided by New York University. Note: Content may be edited for style and length.

  • Machine learning aids in simulating dynamics of interacting atoms

    A revolutionary machine-learning (ML) approach to simulate the motions of atoms in materials such as aluminum is described in this week’s Nature Communications journal. This automated approach to “interatomic potential development” could transform the field of computational materials discovery.
    “This approach promises to be an important building block for the study of materials damage and aging from first principles,” said project lead Justin Smith of Los Alamos National Laboratory. “Simulating the dynamics of interacting atoms is a cornerstone of understanding and developing new materials. Machine learning methods are providing computational scientists new tools to accurately and efficiently conduct these atomistic simulations. Machine learning models like this are designed to emulate the results of highly accurate quantum simulations, at a small fraction of the computational cost.”
    To maximize the general accuracy of these machine learning models, he said, it is essential to design a highly diverse dataset from which to train the model. A challenge is that it is not obvious, a priori, what training data will be most needed by the ML model. The team’s recent work presents an automated “active learning” methodology for iteratively building a training dataset.
    At each iteration, the method uses the current-best machine learning model to perform atomistic simulations; when new physical situations are encountered that are beyond the ML model’s knowledge, new reference data is collected via expensive quantum simulations, and the ML model is retrained. Through this process, the active learning procedure collects data regarding many different types of atomic configurations, including a variety of crystal structures, and a variety of defect patterns appearing within crystals.
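    In pseudocode, that active-learning loop looks roughly like the sketch below. The helper functions are trivial stand-ins so the example runs; a real workflow would call an MD engine, an uncertainty estimate such as ensemble disagreement, and a quantum-chemistry code, and none of the names here are taken from the published software.

    ```python
    # Schematic active-learning loop for building an interatomic-potential
    # training set. Helper functions are toy stand-ins, not the real tools.
    import random

    def run_md_step(model, config):
        return config + random.uniform(-0.1, 0.1)       # stand-in for an MD step

    def model_uncertainty(model, config):
        # Stand-in: "uncertainty" grows with distance from known training data.
        return min(abs(config - c) for c, _ in model["data"])

    def run_quantum_reference(config):
        return config ** 2                              # stand-in for a DFT energy

    def retrain(dataset):
        return {"data": list(dataset)}                  # stand-in for refitting the ML model

    def active_learning(seed_configs, n_iterations=5, threshold=0.5):
        dataset = [(c, run_quantum_reference(c)) for c in seed_configs]
        model = retrain(dataset)
        for _ in range(n_iterations):
            config = dataset[-1][0]
            for _ in range(1000):                       # simulate with the current-best model
                config = run_md_step(model, config)
                if model_uncertainty(model, config) > threshold:
                    # Model is extrapolating: collect expensive reference data
                    # for this configuration and retrain on the enlarged set.
                    dataset.append((config, run_quantum_reference(config)))
                    break
            model = retrain(dataset)
        return model, dataset

    model, data = active_learning(seed_configs=[0.0])
    print(f"training set grew to {len(data)} reference calculations")
    ```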

    Story Source:
    Materials provided by DOE/Los Alamos National Laboratory. Note: Content may be edited for style and length.

  • Measuring hemoglobin levels with AI microscope, microfluidic chips

    One of the most frequently performed medical diagnostic tests to ascertain the health of patients is a complete blood count, which typically includes an estimate of the hemoglobin concentration. The hemoglobin level in the blood is an important biochemical parameter that can indicate a host of medical conditions, including anemia, polycythemia, and pulmonary fibrosis.
    In AIP Advances, by AIP Publishing, researchers from SigTuple Technologies and the Indian Institute of Science describe a new AI-powered imaging-based tool to estimate hemoglobin levels. The setup was developed in conjunction with a microfluidic chip and an AI-powered automated microscope that was designed for deriving the total as well as differential counts of blood cells.
    Often, medical diagnostics equipment capable of multiparameter assessment, such as hematology analyzers, has dedicated subcompartments with separate optical detection systems. This increases both the required sample volume and the cost of the equipment.
    “In this study, we demonstrate that the applicability of a system originally designed for the purposes of imaging can be extended towards the performance of biochemical tests without any additional modifications to the hardware unit, thereby retaining the cost and laboratory footprint of the original device,” said author Srinivasan Kandaswamy.
    The hemoglobin testing solution is possible thanks to the design behind the microfluidic chip, a customized biochemical reagent, optimized imaging, and an image analysis procedure specifically tailored to enable the good clinical performance of the medical diagnostic test.
    The data obtained from the microfluidic chip in combination with an automated microscope was comparable with the predictions of hematology analyzers (Pearson correlation of 0.99). The validation study showed the method meets regulatory standards, which means doctors and hospitals are likely to accept it.
    The automated microscope, which normally uses a combination of red, green, and blue LEDs, used only the green LED in hemoglobin-estimation mode, because the complex formed between hemoglobin and the optimized reagent (SDS-HB) absorbs light at green wavelengths.
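    The estimation step itself is ordinary colorimetry: the green-channel intensity yields an absorbance via the Beer-Lambert relation, and a calibration curve maps that absorbance to a hemoglobin concentration. The sketch below shows that generic calculation with invented calibration values; it is not the published image-analysis pipeline.

    ```python
    # Generic colorimetric hemoglobin estimate from green-channel intensity
    # using the Beer-Lambert law. Calibration slope and intercept are invented
    # for illustration, not values from the published device.
    import math

    def absorbance(sample_intensity, blank_intensity):
        """A = -log10(I / I0), with I0 the intensity through a reagent blank."""
        return -math.log10(sample_intensity / blank_intensity)

    def hemoglobin_g_dl(sample_intensity, blank_intensity, slope=25.0, intercept=0.0):
        """Map absorbance to concentration via a linear calibration curve
        (slope and intercept would come from reference samples)."""
        return slope * absorbance(sample_intensity, blank_intensity) + intercept

    # Example: a green-channel reading of 35 for the sample vs. 120 for the
    # blank corresponds to roughly 13 g/dL under the assumed calibration.
    print(f"{hemoglobin_g_dl(35, 120):.1f} g/dL")
    ```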
    Chip-based microfluidic diagnostic platforms are on the verge of revolutionizing the field of health care, and colorimetric biochemical assays are among the most widely performed diagnostic tests.
    “This paper lays the foundation and will also serve as a guide for future attempts to translate conventional biochemical assays onto a chip, from the point of view of both chip design and reagent development,” said Kandaswamy.
    Besides measuring hemoglobin in the blood, a similar setup with minor modifications could be used to measure protein content, cholesterol, and glycated hemoglobin.

    Story Source:
    Materials provided by American Institute of Physics. Note: Content may be edited for style and length.

  • Environmental policies not always bad for business, study finds

    Critics claim environmental regulations hurt productivity and profits, but the reality is more nuanced, according to an analysis of environmental policies in China by a pair of Cornell economists.
    The analysis found that, contrary to conventional wisdom, market-based or incentive-based policies may actually benefit regulated firms in the traditional and “green” energy sectors, by spurring innovation and improvements in production processes. Policies that mandate environmental standards and technologies, on the other hand, may broadly harm output and profits.
    “The conventional wisdom is not entirely accurate,” said Shuyang Si, a doctoral student in applied economics and management. “The type of policy matters, and policy effects vary by firm, industry and sector.”
    Si is the lead author of “The Effects of Environmental Policies in China on GDP, Output, and Profits,” published in the current issue of the journal Energy Economics. C.-Y. Cynthia Lin Lawell, associate professor in the Charles H. Dyson School of Applied Economics and Management and the Robert Dyson Sesquicentennial Chair in Environmental, Energy and Resource Economics, is a co-author.
    Si mined Chinese provincial government websites and other online sources to compile a comprehensive data set of nearly 2,700 environmental laws and regulations in effect in at least one of 30 provinces between 2002 and 2013. This period came just before China declared a “war on pollution,” instituting major regulatory changes that shifted its longtime prioritization of economic growth over environmental concerns.
    “We really looked deep into the policies and carefully examined their features and provisions,” Si said.

    The researchers categorized each policy as one of four types: “command and control,” such as mandates to use a portion of electricity from renewable sources; financial incentives, including taxes, subsidies and loans; monetary awards for cutting pollution or improving efficiency and technology; and nonmonetary awards, such as public recognition.
    They assessed how each type of policy impacted China’s gross domestic product, industrial output in traditional energy industries and the profits of new energy sector companies, using publicly available data on economic indicators and publicly traded companies.
    Command and control policies and nonmonetary award policies had significant negative effects on GDP, output and profits, Si and Lin Lawell concluded. But a financial incentive — loans for increasing renewable energy consumption — improved industrial output in the petroleum and nuclear energy industries, and monetary awards for reducing pollution boosted new energy sector profits.
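    In spirit, the estimation behind these results is a panel regression of economic outcomes on measures of each policy type, with province and year fixed effects absorbing time-invariant provincial traits and common shocks. The sketch below shows that generic setup with statsmodels on simulated data; the variable names and specification are illustrative and do not reproduce the exact model in Energy Economics.

    ```python
    # Generic fixed-effects panel regression of an outcome (here, log GDP) on
    # counts of policies of each type. Data and specification are illustrative.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    rows = []
    for p in [f"province_{i}" for i in range(30)]:
        base = rng.normal(10, 1)                 # unobserved provincial level
        for year in range(2002, 2014):
            cc = rng.poisson(2)                  # command-and-control policies in force
            fi = rng.poisson(1)                  # financial-incentive policies in force
            log_gdp = (base + 0.03 * (year - 2002)
                       - 0.01 * cc + 0.02 * fi + rng.normal(0, 0.05))
            rows.append(dict(province=p, year=year, cc=cc, fi=fi, log_gdp=log_gdp))
    df = pd.DataFrame(rows)

    # Province and year dummies play the role of fixed effects; the cc and fi
    # coefficients capture the average effect of each policy type.
    fit = smf.ols("log_gdp ~ cc + fi + C(province) + C(year)", data=df).fit()
    print(fit.params[["cc", "fi"]])
    ```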
    “Environmental policies do not necessarily lead to a decrease in output or profits,” the researchers wrote.
    That finding, they said, is consistent with the “Porter hypothesis” — Harvard Business School Professor Michael Porter’s 1991 proposal that environmental policies could stimulate growth and development, by spurring technology and business innovation to reduce both pollution and costs.
    While certain policies benefited regulated firms and industries, the study found that those benefits came at a cost to other sectors and to the overall economy. Nevertheless, Si and Lin Lawell said, these costs should be weighed against the benefits of these policies to the environment and society, and to the regulated firms and industries.
    Economists generally prefer market-based or incentive-based environmental policies, Lin Lawell said, with a carbon tax or tradeable permit system representing the gold standard. The new study led by Si, she said, provides more support for those types of policies.
    “This work will make people aware, including firms that may be opposed to environmental regulation, that it’s not necessarily the case that these regulations will be harmful to their profits and productivity,” Lin Lawell said. “In fact, if policies promoting environmental protection are designed carefully, there are some that these firms might actually like.”
    Additional co-authors contributing to the study were Mingjie Lyu of Shanghai Lixin University of Accounting and Finance, and Song Chen of Tongji University. The authors acknowledged financial support from the Shanghai Science and Technology Development Fund and an ExxonMobil ITS-Davis Corporate Affiliate Fellowship.