More stories

  • Artificial Intelligence to identify individual birds of the same species

    Humans have a hard time identifying individual birds just by looking at the patterns on their plumage. An international study involving scientists from the CNRS, Université de Montpellier and the University of Porto in Portugal, among others, has shown how computers can learn to differentiate individual birds of the same species. The results were published on 27 July 2020 in Methods in Ecology and Evolution.
    Differentiating between individuals of the same species is essential in the study of wild animals, their processes of adaptation and their behaviour. Scientists from the Centre of Evolutionary and Functional Ecology (CEFE; CNRS/Université de Montpellier/Université Paul-Valéry-Montpellier/IRD/EPHE) and the Research Centre in Biodiversity and Genetic Resources (CIBIO) at Porto University have, for the first time, identified individual birds with the help of artificial intelligence technology.
    They have developed a technique that enables them to gather a large number of photographs, taken from various angles, of individual birds wearing electronic tags. These images were fed into computers which used deep learning technology to recognise the birds by analysing the photographs. The computers were able to distinguish individual birds according to the patterns on their plumage, something humans can’t do. The technology was able to identify specimens from populations of three different species: sociable weavers, great tits and zebra finches.
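    The pipeline is conceptually simple: each photograph is reduced to a feature vector, and a new photo is assigned to the individual whose stored signature it most resembles. The toy sketch below illustrates only that matching step, with hypothetical feature values; the actual study trained deep convolutional networks on the raw images.

```python
# Toy illustration of individual re-identification: each photo is reduced to a
# feature vector (in the study, by a deep network; here, hand-made numbers),
# and a new photo is assigned to the closest known bird.
import math

def centroid(vectors):
    """Mean feature vector over one individual's training photos."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def identify(photo_features, gallery):
    """Return the tag of the individual whose centroid is nearest."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(gallery, key=lambda tag: dist(photo_features, gallery[tag]))

# Hypothetical "plumage pattern" features from photos of tagged birds.
training = {
    "weaver_01": [[0.9, 0.1], [0.8, 0.2]],
    "weaver_02": [[0.1, 0.9], [0.2, 0.8]],
}
gallery = {tag: centroid(photos) for tag, photos in training.items()}

print(identify([0.85, 0.15], gallery))  # matches weaver_01
```

    In the real system the feature extraction is the hard part, learned from many images per bird taken at different angles; the electronic tags supply the ground-truth labels during training.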
    This new technique could not only result in a less invasive method of identification but also lead to new insights in ecology, for example, by opening ways of using AI to study animal behaviour in the wild.

    Story Source:
    Materials provided by CNRS. Note: Content may be edited for style and length.

  • COVID-19 increased anxiety, depression for already stressed college students

    College students were more anxious and depressed during the initial outbreak of COVID-19 than they were during similar time frames in previous academic years, according to a Dartmouth study.
    The research also found that sedentary behavior increased dramatically during the onset of the public health crisis in early March.
    The study, published in the Journal of Medical Internet Research, used a mix of smartphone sensing and digital questionnaires from more than 200 students participating in a research program that is tracking mental health throughout their undergraduate years.
    “COVID-19 had an immediate negative impact on the emotional well-being of the college students we studied,” said Jeremy Huckins, a lecturer on psychological and brain sciences at Dartmouth. “We observed a large-scale shift in mental health and behavior compared to the observed baseline established for this group over previous years.”
    Self-reported symptoms of depression and anxiety within the student research group spiked noticeably at the onset of COVID-19. At the time, major policy changes related to COVID-19 were also being put in place, including the request that students leave campus and the switch to remote learning.
    These changes coincided with the end of classes and final exams, already one of the most stressful times for students in any academic term.

    According to the study, anxiety and depression decreased slightly after the final exam period as students settled into shelter-in-place locations. This suggested some resilience in the face of COVID-19, but levels remained consistently higher than similar periods during previous academic terms.
    Unlike previous terms studied, sedentary time increased dramatically during this year’s spring break period.
    “This was an atypical time for these college students. While spring break is usually a period of decreased stress and increased physical activity, spring break 2020 was stressful and confining for the students participating in this study. We suspect that this was the case for a large number of college students across the country,” said Huckins.
    The study used StudentLife, a sensing app developed at Dartmouth, to collect information from student volunteers. StudentLife passively collects behavioral information from users' smartphones, such as duration of phone usage, number of phone unlocks, sleep duration, and sedentary time.
    Data on depression and anxiety were collected using weekly, self-reported assessments also administered through the StudentLife app.
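    Behavioral features like unlock counts and sedentary time are simple aggregations over time-stamped sensor events. A minimal sketch of that aggregation, using a hypothetical log format (the article does not describe StudentLife's actual schema):

```python
# Sketch of turning a passive sensing log into daily behavioral features,
# e.g. phone unlocks per day and total sedentary minutes. The (date, kind,
# value) schema is hypothetical.
from collections import defaultdict

def daily_features(events):
    """events: iterable of (date, kind, value) tuples.
    kind 'unlock' counts once; kind 'sedentary' adds `value` minutes."""
    unlocks = defaultdict(int)
    sedentary = defaultdict(float)
    for date, kind, value in events:
        if kind == "unlock":
            unlocks[date] += 1
        elif kind == "sedentary":
            sedentary[date] += value
    return {d: {"unlocks": unlocks[d], "sedentary_min": sedentary[d]}
            for d in set(unlocks) | set(sedentary)}

log = [
    ("2020-03-10", "unlock", 1),
    ("2020-03-10", "unlock", 1),
    ("2020-03-10", "sedentary", 95.0),
    ("2020-03-11", "sedentary", 240.0),
]
print(daily_features(log))
```

    Daily features of this kind can then be aligned with the weekly self-reported depression and anxiety scores to look for shifts like the ones the study observed.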

    “This is the first time we have used sensor data from phones to give us unique behavioral insights into the reaction of students to the onset of the pandemic on a college campus,” said Andrew Campbell, the Albert Bradley 1915 Third Century Professor of computer science at Dartmouth and one of the lead researchers of the StudentLife study. “We plan to further analyze how these students adjusted both physically and mentally during remote learning that leads on from this study.”
    In the research, the team also reported a connection between anxiety and COVID-19 news coverage. The link between depression and news reporting was apparent, but not as strong. As news coverage intensified, there was an increase in sedentary behavior and a longer duration of phone usage.
    According to the study, the decrease in the number of locations visited was consistent with the social distancing and shelter-in-place policies implemented by local governments.
    The study’s findings on the uptake of social distancing recommendations contrast with other research on college students in which governmental social distancing policies were not followed. Findings in the current study also run contrary to media depictions of college-age students flouting social distancing recommendations during the spring break period.
    “Many people wouldn’t expect college students to listen to social distancing orders, but these students did. We found that when social distancing was recommended by local governments, students were more sedentary and visited fewer locations on any given day,” said Huckins. “Clearly the impact of COVID-19 extends beyond the virus and its direct impacts. An unresolved question is if mental health and physical activity will continue to degrade over time, or if we will see a recovery, and how long that recovery will take.”
    The research is part of a multiyear study focusing on the mental health of undergraduate students as they progress through their undergraduate careers. The complete study combines smartphone mobile sensing with functional neuroimaging.
    “When we set out two years ago to follow 200 students across their college experiences, we could never have anticipated the inflection point in our data as a result of such a catastrophic event as the pandemic,” added Campbell.
    Upon completion of the full study, researchers will be able to extend their findings on the disruption at the start of the COVID-19 pandemic to the long-term impact of remote learning and social isolation that the students are experiencing.
    More information on the StudentLife research program can be found at: https://studentlife.cs.dartmouth.edu

  • Soft robot actuators heal themselves

    Repeated activity wears on soft robotic actuators, but these machines' moving parts need to be reliable and easily fixed. Now a team of researchers has developed a biosynthetic polymer, patterned after squid ring teeth, that is self-healing and biodegradable, creating a material good not only for actuators but also for hazmat suits and other applications where tiny holes could pose a danger.
    “Current self-healing materials have shortcomings that limit their practical application, such as low healing strength and long healing times (hours),” the researchers report in today’s issue of Nature Materials.
    The researchers produced high-strength synthetic proteins that mimic those found in nature. Like the creatures they are patterned on, the proteins can self-heal both minute and visible damage.
    “Our goal is to create self-healing programmable materials with unprecedented control over their physical properties using synthetic biology,” said Melik Demirel, professor of engineering science and mechanics and holder of the Lloyd and Dorothy Foehr Huck Chair in Biomimetic Materials.
    Robotic machines, from industrial robotic arms to prosthetic legs, have joints that move and require a soft material that will accommodate this movement. So do ventilators and personal protective equipment of various kinds. But all materials under continual repetitive motion develop tiny tears and cracks and eventually break. With a self-healing material, these initial tiny defects can be repaired before catastrophic failure ensues.
    Demirel’s team creates the self-healing polymer using a series of DNA tandem repeats, produced by gene duplication, that encode repetitive sequences of amino acids. Tandem repeats are short stretches of DNA arranged to repeat themselves any number of times. The researchers manufacture the polymer in standard bacterial bioreactors.

    “We were able to reduce a typical 24-hour healing period to one second so our protein-based soft robots can now repair themselves immediately,” said Abdon Pena-Francesch, lead author of the paper and a former doctoral student in Demirel’s lab. “In nature, self-healing takes a long time. In this sense, our technology outsmarts nature.”
    The self-healing polymer heals with the application of water and heat, although Demirel said that it could also heal using light.
    “If you cut this polymer in half, when it heals it gains back 100 percent of its strength,” said Demirel.
    Metin Sitti, director of the Physical Intelligence Department at the Max Planck Institute for Intelligent Systems in Stuttgart, Germany, and his team worked with the polymer, creating holes and healing them. They then created soft actuators that, through use, cracked and then healed in real time, in about one second.
    “Self-repairing physically intelligent soft materials are essential for building robust and fault-tolerant soft robots and actuators in the near future,” said Sitti.

    By adjusting the number of tandem repeats, Demirel’s team created a soft polymer that healed rapidly and retained its original strength, but they also created a polymer that is 100% biodegradable and 100% recyclable into the same, original polymer.
    “We want to minimize the use of petroleum-based polymers for many reasons,” said Demirel. “Sooner or later we will run out of petroleum, and it is also polluting and causing global warming. We can’t compete with the really inexpensive plastics. The only way to compete is to supply something the petroleum-based polymers can’t deliver, and self-healing provides the performance needed.”
    Demirel explained that while many petroleum-based polymers can be recycled, they are recycled into something different. For example, polyester t-shirts can be recycled into bottles, but not into polyester fibers again.
    Just as the squid the polymer mimics biodegrades in the ocean, the biomimetic polymer will biodegrade. With the addition of an acid like vinegar, the polymer will also recycle into a powder that is again manufacturable into the same, soft, self-healing polymer.
    “This research illuminates the landscape of material properties that become accessible by going beyond proteins that exist in nature using synthetic biology approaches,” said Stephanie McElhinny, biochemistry program manager, Army Research Office, an element of the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory. “The rapid and high-strength self-healing of these synthetic proteins demonstrates the potential of this approach to deliver novel materials for future Army applications, such as personal protective equipment or flexible robots that could maneuver in confined spaces.”

  • More realistic computer graphics

    Researchers at Dartmouth, in collaboration with industry partners, have developed software techniques that make lighting in computer-generated images look more realistic. The research will be presented at the upcoming ACM SIGGRAPH conference, the premier venue for research in computer graphics.
    The new techniques focus on “real time” graphics which need to maintain the illusion of interactivity as scenes change in response to user moves. These graphics can be used in applications such as video games, extended reality, and scientific visualization tools.
    Both papers demonstrate how developers can create sophisticated lighting effects by adapting a popular rendering technique known as ray tracing.
    “Over the last decade, ray tracing has dramatically increased the realism and visual richness of computer-generated images in movies where producing just a single frame can take hours,” said Wojciech Jarosz, an associate professor of computer science at Dartmouth who served as the senior researcher for both projects. “Our papers describe two very different approaches for bringing realistic ray-traced lighting to the constraints of real time graphics.”
    The first project, developed with NVIDIA, envisions the possibilities for future games once developers incorporate NVIDIA’s hardware-accelerated RTX ray tracing platform. Recent games have started to use RTX for physically correct shadows and reflections, but quality and complexity of lighting is currently limited by the small number of rays that can be traced per frame.
    The new technique, called reservoir-based spatiotemporal importance resampling (ReSTIR), creates realistic lighting and shadows from millions of artificial light sources. The ReSTIR approach dramatically increases the quality of rendering on a computer’s graphics card by reusing rays that were traced in neighboring pixels and in prior frames.
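    The building block that makes this reuse cheap is weighted reservoir sampling: a stream of candidate light samples is reduced to a single survivor, kept with probability proportional to its weight, so reservoirs from neighboring pixels and prior frames can be merged without revisiting the candidates. A simplified single-sample reservoir, not the paper's full algorithm:

```python
# Simplified weighted reservoir sampling: stream many weighted candidates,
# keep exactly one, chosen with probability proportional to its weight.
import random

class Reservoir:
    def __init__(self):
        self.sample = None
        self.w_sum = 0.0

    def update(self, candidate, weight, rng):
        # After each call, `sample` holds candidate i with probability
        # weight_i / sum of all weights seen so far.
        self.w_sum += weight
        if weight > 0 and rng.random() < weight / self.w_sum:
            self.sample = candidate

rng = random.Random(42)
res = Reservoir()
# Hypothetical light samples with importance weights.
for light, w in [("dim", 0.1), ("medium", 1.0), ("bright", 10.0)]:
    res.update(light, w, rng)
print(res.sample)  # most often "bright" (weight 10 out of 11.1 total)
```

    Because a reservoir carries only its current sample and running weight sum, two reservoirs can be combined in constant time, which is what lets ReSTIR amortize ray tracing across space and time.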

    The new technique can be integrated into the design of future games and works up to 65 times faster than previous rendering techniques.
    “This technology is not just exciting for what it can bring to real-time applications like games, but also its impact in the movie industry and beyond,” said Benedikt Bitterli, a PhD student at Dartmouth who served as the first author of a research paper on the technique.
    The second project, conducted in collaboration with Activision, describes how the video game publisher has incorporated increasingly realistic lighting effects into its games.
    Traditionally, video games create lighting sequences in real time using what are called “baked” solutions: the complex ray-traced illumination is computed only once through a time-consuming process. The lighting created using this technique can be displayed easily during gameplay, but it is constrained to assuming a fixed configuration for a scene. As a result, the lighting cannot easily react to the movement of characters and cameras.
    The research paper describes how Activision gradually evolved its “UberBake” system from the static approach to one which can depict subtle lighting changes in response to player interactions, such as turning lights on and off, or opening and closing doors.

    Since UberBake was developed over many years to work on current games, it needed to work on a variety of existing hardware, ranging from high-end PCs to previous-generation gaming consoles.
    “Video games are used by millions of people around the world,” said Dario Seyb, a PhD student at Dartmouth who served as the research paper’s co-first author. “With so many people interacting with video games, this technology can have a huge impact.”
    Dartmouth researchers on both projects are affiliated with the Dartmouth Visual Computing Lab.
    “These industry collaborations have been fantastic. They allow our students to work on foundational academic research informed by practical problems in industry, allowing the work to have a more immediate, real-world impact,” said Jarosz.
    The research papers will be published in ACM Transactions on Graphics and presented at SIGGRAPH 2020 taking place online during the summer.

    Story Source:
    Materials provided by Dartmouth College.

  • If relaxed too soon, physical distancing measures might have been all for naught

    If physical distancing measures in the United States are relaxed while there is still no COVID-19 vaccine or treatment and while personal protective equipment remains in short supply, the number of resulting infections could be about the same as if distancing had never been implemented to begin with, according to a UCLA-led team of mathematicians and scientists.
    The researchers compared the results of three related mathematical models of disease transmission that they used to analyze data emerging from local and national governments, including one that measures the dynamic reproduction number — the average number of susceptible people infected by one previously infected person. The models all highlight the dangers of relaxing public health measures too soon.
    “Distancing efforts that appear to have succeeded in the short term may have little impact on the total number of infections expected over the course of the pandemic,” said lead author Andrea Bertozzi, a distinguished professor of mathematics who holds UCLA’s Betsy Wood Knapp Chair for Innovation and Creativity. “Our mathematical models demonstrate that relaxing these measures in the absence of pharmaceutical interventions may allow the pandemic to reemerge. It’s about reducing contact with other people, and this can be done with PPE as well as distancing.”
    The study is published in the journal Proceedings of the National Academy of Sciences and is applicable to both future spikes of COVID-19 and future pandemics, the researchers say.
    If distancing and shelter-in-place measures had not been taken in March and April, it is very likely the number of people infected in California, New York and elsewhere would have been dramatically higher, posing a severe burden on hospitals, Bertozzi said. But the total number of infections predicted if these precautions end too soon is similar to the number that would be expected over the course of the pandemic without such measures, she said. In other words, short-term distancing can slow the spread of the disease but may not result in fewer people becoming infected.
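    That qualitative claim, that distancing delays infections but barely changes the final total if lifted before the epidemic has burned out, can be reproduced with a textbook SIR model. The parameters below are illustrative, not the study's fitted values:

```python
# Minimal SIR epidemic: distancing temporarily lowers the transmission rate
# beta. Compare the final epidemic size with no distancing vs. distancing
# that is relaxed early. Illustrative parameters, not fitted to COVID-19.
def sir_final_size(beta_of_t, gamma=0.1, days=2000, dt=0.1, i0=1e-4):
    s, i, r = 1.0 - i0, i0, 0.0  # susceptible, infected, recovered fractions
    t = 0.0
    while t < days:
        new_inf = beta_of_t(t) * s * i * dt
        new_rec = gamma * i * dt
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        t += dt
    return r  # fraction ever infected by the end of the run

def no_distancing(t):
    return 0.3                       # R0 = 3 throughout

def relaxed_early(t):
    return 0.12 if t < 60 else 0.3   # 60 days of distancing, then lifted

print(round(sir_final_size(no_distancing), 2))
print(round(sir_final_size(relaxed_early), 2))
# Both runs infect a similarly large fraction of the population;
# the distancing period mainly delays the peak.
```

    The delay is still valuable in practice, since it spreads hospital load over time, which is consistent with the study's point that relaxation is safer when paired with PPE and testing.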
    Mathematically modeling and forecasting the spread of COVID-19 are critical for effective public health policy, but wide differences in precautionary approaches across the country have made it a challenge, said Bertozzi, who is also a distinguished professor of mechanical and aerospace engineering. Social distancing and wearing face masks reduce the spread of COVID-19, but people in many states are not following distancing guidelines and are not wearing masks — and the number of infections continues to rise.

    What are the implications of these findings for policymakers who want to relax social distancing in an effort to revive their economies?
    “Policymakers need to be careful,” Bertozzi said. “Our study predicts a surge in cases in California after distancing measures are relaxed. Alternative strategies exist that would allow the economy to ramp up without substantial new infections. Those strategies all involve significant use of PPE and increased testing.”
    During the 1918 influenza pandemic, social distancing was first enforced and then relaxed in some areas. Bertozzi points to a study published in Proceedings of the National Academy of Sciences in 2007 that looked at several American cities during that pandemic where a second wave of infections occurred after public health measures were removed too early.
    That study found that the timing of public health interventions had a profound influence on the pattern of the second wave of the 1918 pandemic in different cities. Cities that had introduced measures early in the pandemic achieved significant reductions in overall mortality. Larger reductions in peak mortality were achieved by those cities that extended the public health measures for longer. San Francisco, St. Louis, Milwaukee and Kansas City, for instance, had the most effective interventions, reducing transmission rates by 30% to 50%.
    “Researchers Martin Bootsma and Neil Ferguson were able to analyze the effectiveness of distancing measures by comparing the data against an estimate for what might have happened had distancing measures not been introduced,” Bertozzi said of the 2007 study. “They considered data from the full pandemic, while we addressed the question of fitting models to early-time data for this pandemic. During the 1918 influenza pandemic, the early relaxation of social distancing measures led to a swift uptick in deaths in some U.S. cities. Our mathematical models help to explain why this effect might occur today.”
    The COVID-19 data in the new study are from April 1, 2020, and are publicly available. The study is aimed at scientists who are not experts in epidemiology.
    “Epidemiologists are in high demand during a pandemic, and public health officials from local jurisdictions may have a need for help interpreting data,” Bertozzi said. “Scientists with relevant background can be tapped to assist these people.”
    Study co-authors are Elisa Franco, a UCLA associate professor of mechanical and aerospace engineering and bioengineering; George Mohler, an associate professor of computer and information science at Indiana University-Purdue University Indianapolis; Martin Short, an associate professor of mathematics at Georgia Tech; and Daniel Sledge, an associate professor of political science at the University of Texas at Arlington.

  • Researchers develop a method for predicting unprecedented events

    A black swan event is a highly unlikely but massively consequential incident, such as the 2008 global recession and the loss of one-third of the world’s saiga antelope in a matter of days in 2015. Challenging the quintessentially unpredictable nature of black swan events, bioengineers at Stanford University are suggesting a method for forecasting these supposedly unforeseeable fluctuations.
    “By analyzing long-term data from three ecosystems, we were able to show that fluctuations that happen in different biological species are statistically the same across different ecosystems,” said Samuel Bray, a research assistant in the lab of Bo Wang, assistant professor of bioengineering at Stanford. “That suggests there are certain underlying universal processes that we can take advantage of in order to forecast this kind of extreme behavior.”
    The forecasting method the researchers have developed, which was detailed recently in PLOS Computational Biology, is based on natural systems and could find use in health care and environmental research. It also has potential applications in disciplines outside ecology that have their own black swan events, such as economics and politics.
    “This work is exciting because it’s a chance to take the knowledge and the computational tools that we’re building in the lab and use those to better understand — even predict or forecast — what happens in the world surrounding us,” said Wang, who is senior author of the paper. “It connects us to the bigger world.”
    From microbes to avalanches
    Over years of studying microbial communities, Bray noticed several instances where one species would undergo an unanticipated population boom, overtaking its neighbors. Discussing these events with Wang, they wondered whether this phenomenon occurred outside the lab as well and, if so, whether it could be predicted.

    In order to address this question, the researchers had to find other biological systems that experience black swan events. The researchers needed details, not only about the black swan events themselves but also the context in which they occurred. So, they specifically sought ecosystems that scientists have been closely monitoring for many years.
    “These data have to capture long periods of time and that’s hard to collect,” said Bray, who is lead author of the paper. “It’s much more than a PhD-worth of information. But that’s the only way you can see the spectra of these fluctuations at large scales.”
    Bray settled on three eclectic datasets: an eight-year study of plankton from the Baltic Sea with species levels measured twice weekly; net carbon measurements from a deciduous broadleaf forest at Harvard University, gathered every 30 minutes since 1991; and measurements of barnacles, algae and mussels on the coast of New Zealand, taken monthly for over 20 years.
    The researchers then analyzed these three datasets using theory about avalanches — physical fluctuations that, like black swan events, exhibit short-term, sudden, extreme behavior. At its core, this theory attempts to explain the physics of systems like avalanches, earthquakes, fire embers, or even crumpling candy wrappers, which all respond to external forces with discrete events of various magnitudes or sizes — a phenomenon scientists call “crackling noise.”
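    A standard statistical signature of crackling noise is a heavy-tailed, power-law distribution of event sizes, p(s) ~ s^(-alpha). A sketch of the usual maximum-likelihood (Hill) estimate of that exponent, run here on synthetic data rather than the ecological datasets:

```python
# Avalanche-style "crackling noise" produces event sizes with a power-law
# tail. A standard check on fluctuation data is the maximum-likelihood
# (Hill) estimate of the exponent alpha. Synthetic data stands in for the
# ecological time series here.
import math
import random

def hill_alpha(sizes, s_min=1.0):
    """MLE exponent for a continuous power law p(s) ~ s^-alpha above s_min."""
    tail = [s for s in sizes if s >= s_min]
    return 1.0 + len(tail) / sum(math.log(s / s_min) for s in tail)

rng = random.Random(0)
alpha_true = 2.5
# Inverse-transform sampling: s = s_min * (1 - u)^(-1/(alpha - 1))
sizes = [(1.0 - rng.random()) ** (-1.0 / (alpha_true - 1.0))
         for _ in range(20000)]
print(round(hill_alpha(sizes), 2))  # close to 2.5
```

    Finding statistically similar exponents across plankton, forest carbon and intertidal datasets is the kind of evidence behind the claim that the same underlying processes govern fluctuations in very different ecosystems.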
    Built on the analysis, the researchers developed a method for predicting black swan events, one that is designed to be flexible across species and timespans, and able to work with data that are far less detailed and more complex than those used to develop it.

    “Existing methods rely on what we have seen to predict what might happen in the future, and that’s why they tend to miss black swan events,” said Wang. “But Sam’s method is different in that it assumes we are only seeing part of the world. It extrapolates a little about what we’re missing, and it turns out that helps tremendously in terms of prediction.”
    Forecasting in the real world
    The researchers tested their method using the three ecosystem datasets on which it was built. Using only fragments of each dataset — specifically fragments which contained the smallest fluctuations in the variable of interest — they were able to accurately predict extreme events that occurred in those systems.
    They would like to expand the application of their method to other systems in which black swan events are also present, such as in economics, epidemiology, politics and physics. At present, the researchers are hoping to collaborate with field scientists and ecologists to apply their method to real-world situations where they could make a positive difference in the lives of other people and the planet.
    This research was funded by the Volkswagen Foundation and Arnold and Mabel Beckman Foundation. Wang is also a member of Stanford Bio-X and the Wu Tsai Neurosciences Institute.

  • A new MXene material shows extraordinary electromagnetic interference shielding ability

    As we welcome wireless technology into more areas of life, the additional electronic bustle is making for an electromagnetically noisy neighborhood. In hopes of limiting the extra traffic, researchers at Drexel University have been testing two-dimensional materials known for their interference-blocking abilities. Their latest discovery, reported in the journal Science, is the exceptional shielding ability of a new two-dimensional material that can absorb electromagnetic interference rather than just deflecting it back into the fray.
    The material, called titanium carbonitride, is part of a family of two-dimensional materials, called MXenes, that were first produced at Drexel in 2011. Researchers have revealed that these materials have a number of exceptional properties, including impressive strength, high electrical conductivity and molecular filtration abilities. Titanium carbonitride’s exceptional trait is that it can block and absorb electromagnetic interference more effectively than any known material, including the metal foils currently used in most electronic devices.
    “This discovery breaks all the barriers that existed in the electromagnetic shielding field. It not only reveals a shielding material that works better than copper, but it also shows an exciting, new physics emerging, as we see discrete two-dimensional materials interact with electromagnetic radiation in a different way than bulk metals,” said Yury Gogotsi, PhD, Distinguished University and Bach professor in Drexel’s College of Engineering, who led the research group that made this MXene discovery, which also included scientists from the Korea Institute of Science and Technology, and students from Drexel’s co-op partnership with the Institute.
    While electromagnetic interference — “EMI” to engineers and technologists — is noticed only infrequently by the users of technology, likely as a buzzing noise from a microphone or speaker, it is a constant concern for the engineers who design that technology. EMI interferes with other electrical components, such as antennas and circuitry: it diminishes electrical performance, can slow data exchange rates and can even interrupt the function of devices.
    Electronics designers and engineers tend to use shielding materials to contain and deflect EMI in devices, either by covering the entire circuit board with a copper cage or, more recently, by wrapping individual components in foil shielding. But both of these strategies add bulk and weight to the devices.
    Gogotsi’s group discovered that its MXene materials, which are much thinner and lighter than copper, can be quite effective at EMI shielding. Their findings, reported in Science four years ago, indicated that a MXene called titanium carbide showed the potential to be as effective as the industry-standard materials at the time, and it could be easily applied as a coating. This research quickly became one of the most impactful discoveries in the field and inspired other researchers to look at other materials for EMI shielding.

    But as the Drexel and KIST teams continued to inspect other members of the family for this application, they uncovered the unique qualities of titanium carbonitride that make it an even more promising candidate for EMI shielding applications.
    “Titanium carbonitride has a structure very similar to that of titanium carbide — they’re actually identical, aside from half of the carbon atoms being replaced with nitrogen atoms — but titanium carbonitride is about an order of magnitude less conductive,” said Kanit Hantanasirisakul, a doctoral candidate in Drexel’s Department of Materials Science and Engineering. “So we wanted to gain a fundamental understanding of the effects of conductivity and elemental composition on EMI shielding.”
    Through a series of tests, the group made a startling discovery: a film of the titanium carbonitride material, many times thinner than a strand of human hair, could actually block EMI about 3-5 times more effectively than a similar thickness of copper foil, which is typically used in electronic devices.
    “It’s important to note that we didn’t initially expect the titanium carbonitride MXene to be better compared to the most conductive of all MXenes known: titanium carbide,” Hantanasirisakul said. “We first thought there might be something wrong with the measurements or the calculations. So, we repeated experiments over and over again to make sure we did everything correctly and the values were reproducible.”
    Perhaps more significant than the team’s discovery of the material’s shielding prowess is their new understanding of the way it works. Most EMI shielding materials simply prevent the penetration of the electromagnetic waves by reflecting them away. While this is effective for protecting components, it doesn’t alleviate the overall problem of EMI propagation in the environment. Gogotsi’s group found that titanium carbonitride actually blocks EMI by absorbing the electromagnetic waves.
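    Shielding effectiveness is conventionally quoted in decibels, SE = 10·log10(P_incident / P_transmitted), regardless of whether the blocked power is reflected or absorbed. A small helper makes the scale concrete (the numbers are illustrative, not the paper's measurements):

```python
# Shielding effectiveness (SE) in decibels vs. the fraction of incident
# power that leaks through the shield: SE = 10 * log10(P_in / P_out).
import math

def se_db(p_in, p_out):
    """Shielding effectiveness in dB from incident and transmitted power."""
    return 10.0 * math.log10(p_in / p_out)

def transmitted_fraction(se):
    """Fraction of incident power that passes a shield of SE decibels."""
    return 10.0 ** (-se / 10.0)

# Illustrative: a 30 dB shield passes 0.1% of incident power;
# each additional 10 dB cuts the leaked power by another factor of ten.
print(transmitted_fraction(30.0))  # 0.001
print(se_db(1.0, 1e-5))            # 50.0 dB
```

    The decibel scale is why small-sounding differences between materials matter: a few extra dB means severalfold less leaked power, and an absorbing shield removes that power instead of scattering it back into the device.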


    “This is a much more sustainable way to handle electromagnetic pollution than simply reflecting waves that can still damage other devices that are not shielded,” Hantanasirisakul said. “We found that most of the waves are absorbed by the layered carbonitride MXene films. It’s like the difference between kicking litter out of your way or picking it up — this is ultimately a much better solution.”
    This also means that titanium carbonitride could be used to individually coat components inside a device to contain their EMI even when they are placed close together. Companies like Apple have been trying this containment strategy for several years, but with success limited by the thickness of the copper foil. As device designers strive to make electronics ubiquitous by making them smaller, less noticeable and more integrated, this strategy is likely to become the new norm.
    The researchers suspect that titanium carbonitride’s uniqueness is due to its layered, porous structure, which allows EMI to partially penetrate the material, and its chemical composition, which traps and dissipates the EMI. This combination of characteristics emerges within the material when it is heated in a final step of formation, called annealing.
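The split between reflection and absorption described above can be quantified from standard scattering-parameter measurements of a shielding film. As a rough sketch (the input values here are invented for illustration, not the paper’s data), total shielding effectiveness in decibels decomposes into a reflection term and an absorption term:

```python
import math

def emi_shielding_breakdown(s11, s21):
    """Split total EMI shielding effectiveness (dB) into reflection
    and absorption contributions, given the complex or real-valued
    scattering parameters of the shield."""
    R = abs(s11) ** 2  # fraction of incident power reflected
    T = abs(s21) ** 2  # fraction of incident power transmitted
    se_reflection = -10 * math.log10(1 - R)        # loss from reflection
    se_absorption = -10 * math.log10(T / (1 - R))  # loss from absorption
    se_total = -10 * math.log10(T)                 # total shielding
    return se_reflection, se_absorption, se_total

# Hypothetical absorption-dominated film: little reflection, strong
# attenuation of whatever enters the material.
se_r, se_a, se_tot = emi_shielding_breakdown(s11=0.3, s21=0.05)
```

With these illustrative numbers, the absorption term dominates the total by a wide margin, which is the signature behavior the researchers report for the carbonitride film.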
    “It was a counterintuitive finding. EMI shielding effectiveness typically increases with electrical conductivity. We knew that heat treatment can increase conductivity, so we tried that with the titanium carbonitride to see if it would improve its shielding ability. What we discovered is that it only marginally improved its conductivity, but vastly boosted its shielding effectiveness,” Gogotsi said. “This work motivates us, and should motivate others in the field, to look into properties and applications of other MXenes, as they may show even better performance despite being less electrically conductive.”
    The Drexel team has been expanding its scope and has already examined EMI shielding capabilities of 16 different MXene materials — about half of all MXenes produced in its lab. It plans to continue its investigation of titanium carbonitride to better understand its unique electromagnetic behavior, in hope of predicting hidden abilities in other materials.
    In addition to Gogotsi and Hantanasirisakul, Aamir Iqbal, Faisal Shahzad, Myung-Ki Kim, Hisung Kwon, Junpyo Hong, Hyerim Kim, Daesin Kim and Chong Min Koo, researchers from the Korea Institute of Science and Technology (KIST), contributed to this research. More

  • in

    How a few negative online reviews early on can hurt a restaurant

    Just a few negative online restaurant reviews can determine early on how many reviews a restaurant receives long-term, a new study has found.
    The study, published online earlier this month in the journal Papers in Applied Geography, also found that a neighborhood’s median household income affected whether restaurants were rated at all.
    “These online platforms advertise themselves as being unbiased, but we found that that is not the case,” said Yasuyuki Motoyama, lead author of the paper and an assistant professor of city and regional planning at The Ohio State University.
    “The way these platforms work, popular restaurants get even more popular, and restaurants with some initial low ratings can stagnate.”
    The study evaluated reviews of about 3,000 restaurants per site from the websites Yelp and Tripadvisor, all in Franklin County, Ohio. Franklin County, home to Columbus and Ohio State, is also home to the headquarters of more than 20 restaurant chains. Previous research has found that the food industry considers consumer preferences in the area to be a litmus test for the broader U.S. market.
    The researchers collected reviews for restaurants published in May 2019, then analyzed those reviews by rating and geographic location. They also considered each neighborhood’s demographics and its socioeconomic status, based on household income.


    The study found that restaurants with a smaller number of reviews on sites like Yelp and Tripadvisor had a higher likelihood of a low rating.
    “The more reviews a restaurant received, the higher the average rating of the restaurant,” said Kareem Usher, co-author of the paper and an assistant professor of city and regional planning at Ohio State. “But this has implications: If one of the first reviews a restaurant receives comes from a dissatisfied customer, and people check that later and think ‘I don’t want to go there’ based on that one review, then there will be fewer reviews of that restaurant.”
    The opposite is true for restaurants that receive positive reviews or a large number of reviews: More people are likely to review those restaurants, improving the likelihood that a restaurant’s average rating will be higher.
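The feedback loop the researchers describe can be sketched with a toy simulation. This is an illustrative rich-get-richer model, not the study’s methodology, and every parameter in it is invented: two restaurants of identical underlying quality diverge in review counts depending only on their first review.

```python
import random

def simulate_reviews(first_review, n_customers=500, base_quality=4.0, seed=0):
    """Toy model: each prospective customer checks the current average
    rating and is less likely to visit (and later review) a restaurant
    whose average is low."""
    rng = random.Random(seed)
    ratings = [first_review]
    for _ in range(n_customers):
        avg = sum(ratings) / len(ratings)
        if rng.random() < avg / 5.0:  # better average -> more visits
            # the visitor's rating reflects true quality plus noise
            r = min(5, max(1, round(rng.gauss(base_quality, 1.0))))
            ratings.append(r)
    return len(ratings), sum(ratings) / len(ratings)

def mean_review_count(first_review, trials=200):
    """Average review count across many simulated restaurants."""
    return sum(simulate_reviews(first_review, seed=t)[0]
               for t in range(trials)) / trials

# Identical quality, different first impressions: the restaurant whose
# first review was a 1-star ends up with fewer reviews on average than
# one whose first review was a 5-star.
n_bad_start = mean_review_count(first_review=1)
n_good_start = mean_review_count(first_review=5)
```

Even in this crude model, the early low rating suppresses visits, which suppresses reviews, which keeps the low rating influential for longer, mirroring the stagnation effect Motoyama describes.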
    The study found that 17.6 percent of restaurants with only one to four reviews received a low rating on Yelp; that figure dropped to 9.3 percent for those with between five and 10 reviews. On Tripadvisor, restaurants with one to four reviews had a 5.6 percent probability of a low rating, falling to 0.6 percent for those with five to 10 reviews.
    Researchers also found that restaurants in several of the poorest neighborhoods in Franklin County tend not to be rated on the sites. However, the researchers did not find a direct link between a neighborhood’s socioeconomics or racial makeup and the average rating of the restaurants there.
    Motoyama cautioned that the study had some limits: It was conducted in one county, and future work could expand to include other areas around the country. The high-level multivariate analysis could only use the Yelp data, because most of the key information was missing from Tripadvisor. The researchers also did not analyze the content of the reviews, which could offer additional clues about bias.
    But, he said, the study does indicate that online review sites can have significant effects on a restaurant’s success or failure — and suggests that, perhaps, the sites can set up policies that might be more fair.
    “Maybe these online platforms can withhold reviews until a restaurant gets a certain number of reviews — say, 10 or more,” he said. “That way if there are two or three customers who are very dissatisfied with a particular experience, they are not directing the restaurant’s success or failure.”

    Story Source:
    Materials provided by Ohio State University. Original written by Laura Arenschield. Note: Content may be edited for style and length. More