More stories

  • Predicting your personality from your smartphone data

    Everyone who uses a smartphone unavoidably generates masses of digital data that are accessible to others, and these data provide clues to the user’s personality. Psychologists at Ludwig-Maximilians-Universität (LMU) in Munich are studying how revealing these clues are.
    For most people around the world, smartphones have become an integral and indispensable part of their daily lives. The digital data that these devices incessantly collect are a veritable goldmine — not only for the five largest American IT companies, which use them for advertising purposes, but also in other contexts. For instance, computational social scientists use smartphone data to learn more about personality traits and social behavior. In a study that appears in the journal PNAS, a team of researchers led by LMU psychologist Markus Bühner set out to determine whether conventional data passively collected by smartphones (such as times or frequencies of use) provide insights into users’ personalities. The answer was clear-cut. “Yes, automated analysis of these data does allow us to draw conclusions about the personalities of users, at least for most of the major dimensions of personality,” says Clemens Stachl, who used to work with Markus Bühner (Chair of Psychological Methodologies and Diagnostics at LMU) and is now a researcher at Stanford University in California.
    The LMU team recruited 624 volunteers for their PhoneStudy project. The participants agreed to fill out an extensive questionnaire describing their personality traits, and to install an app, developed specially for the study, on their phones for 30 days. The app was designed to collect coded information relating to the user’s behavior. The researchers were primarily interested in data pertaining to communication patterns, social behavior and mobility, together with users’ choice and consumption of music, the selection of apps used, and the temporal distribution of their phone usage over the course of the day. All the data on personality and smartphone use were then analyzed with the aid of machine-learning algorithms, which were trained to recognize and extract patterns from the behavioral data and relate these patterns to the information obtained from the personality surveys. The ability of the algorithms to predict the personality traits of the users was then cross-validated on a new dataset. “By far the most difficult part of the project was the pre-processing of the huge amount of data collected and the training of the predictive algorithms,” says Stachl. “In fact, in order to perform the necessary calculations, we had to resort to the cluster of high-performance computers at the Leibniz Supercomputing Centre (LRZ) in Garching.”
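    To make the modelling step concrete, the following is a minimal sketch of cross-validated trait prediction in the spirit of the approach described above. It is not the authors’ actual pipeline; the file, feature names and model choice are hypothetical.

    ```python
    # Hypothetical sketch: predict a self-reported Big Five score from
    # passively logged phone-usage features, with cross-validation.
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    df = pd.read_csv("phonestudy_features.csv")   # hypothetical feature export
    X = df[["calls_per_day", "night_usage_minutes",
            "n_apps_used", "music_genre_entropy"]]
    y = df["extraversion_score"]                  # questionnaire-derived target

    model = RandomForestRegressor(n_estimators=300, random_state=0)
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"Cross-validated R^2: {scores.mean():.2f}")
    ```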
    The researchers focused on the five most significant personality dimensions (the Big Five) identified by psychologists, which enable them to characterize personality differences between individuals in a comprehensive way. These dimensions relate to the self-assessed contribution of each of the following traits to a given individual’s personality: (1) openness (willingness to adopt new ideas, experiences and values), (2) conscientiousness (dependability, punctuality, ambitiousness and discipline), (3) extraversion (sociability, assertiveness, adventurousness, dynamism and friendliness), (4) agreeableness (willingness to trust others; a good-natured, obliging, helpful disposition) and (5) emotional stability (self-confidence, equanimity, positivity, self-control). The automated analysis revealed that the algorithm was indeed able to derive most of these personality traits from combinations of the many elements of users’ smartphone behavior. Moreover, the results provide hints as to which types of digital behavior are most informative for specific self-assessments of personality. For example, data pertaining to communication patterns and social behavior (as reflected by smartphone use) correlated strongly with levels of self-reported extraversion, while information relating to patterns of day- and night-time activity was significantly predictive of self-reported degrees of conscientiousness. Notably, links with the category ‘openness’ only became apparent when highly disparate types of data (e.g., app usage) were combined.
    The results of the study are of great value to researchers, as studies have so far been based almost exclusively on self-assessments. The conventional method has proven sufficiently reliable in predicting levels of professional success, for instance. “Nevertheless, we still know very little about how people actually behave in their everyday lives — apart from what they choose to tell us on our questionnaires,” says Markus Bühner. “Thanks to their broad distribution, their intensive use and their very high level of performance, smartphones are an ideal tool with which to probe the relationships between self-reported and real patterns of behavior.”
    Clemens Stachl is aware that his research might further stimulate the appetites of the dominant IT firms for data. In addition to regulating the use of passively collected data and strengthening rights to privacy, we also need to take a comprehensive look at the field of artificial intelligence, he says. “The user, not the machine, must be the primary focus of research in this area. It would be a serious mistake to adopt machine-based methods of learning without serious consideration of their wider implications.” The potential of these applications — in both research and business — is tremendous. “The opportunities opened up by today’s data-driven society will undoubtedly improve the lives of large numbers of people,” Stachl says. “But we must ensure that all sections of the population share the benefits offered by digital technologies.”

  • New technology speeds up organic data transfer

    Researchers are pushing the boundaries of data speed with a brand-new type of organic LED.
    An international research team, involving Newcastle University experts, developed a visible light communication (VLC) setup capable of a data rate of 2.2 Mb/s by employing a new type of organic light-emitting diodes (OLEDs).
    To reach this speed, the scientists created new far-red/near-infrared, solution-processed OLEDs. By extending the spectral range to 700-1000 nm, they expanded the usable bandwidth and achieved the fastest data rate yet reported for solution-processed OLEDs.
    Described in the journal Light: Science & Applications, the new OLEDs create opportunities for new internet-of-things (IoT) connectivity, as well as for wearable and implantable biosensor technology.
    The project is a collaboration between Newcastle University, University College London, the London Centre for Nanotechnology, the Institute of Organic Chemistry — Polish Academy of Sciences (Warsaw, Poland) and the Institute for the Study of Nanostructured Materials — Research National Council (CNR-ISMN, Bologna, Italy).
    Dr Paul Haigh, Lecturer in Communications at Newcastle University’s Intelligent Sensing and Communications Group, was part of the research team. He led the development of the real-time signal transmission, using modulation formats developed in-house to push the data rate as high as possible and reaching approximately 2.2 Mb/s.
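    As a rough illustration of how device bandwidth and modulation order set an achievable data rate (the team’s in-house modulation formats are not reproduced here; the figures below are assumptions chosen only to make the arithmetic concrete):

    ```python
    # Hypothetical back-of-the-envelope link-rate estimate.
    from math import log2

    bandwidth_hz = 550e3       # assumed usable OLED modulation bandwidth
    modulation_order = 4       # assumed 4-level format (2 bits per symbol)

    symbol_rate = 2 * bandwidth_hz                   # Nyquist-limited baseband symbol rate
    bit_rate = symbol_rate * log2(modulation_order)

    print(f"Estimated bit rate: {bit_rate / 1e6:.1f} Mb/s")   # ~2.2 Mb/s
    ```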
    Dr Haigh said: “Our team developed highly efficient long-wavelength (far-red/near-infrared) polymer LEDs for the first time, free of heavy metals, which has been a long-standing research challenge in the organic optoelectronics community. Achieving such high data rates opens up opportunities for the integration of portable, wearable or implantable organic biosensors into visible/nearly (in)visible light communication links.”
    The demand for faster data transmission speeds is driving the popularity of light-emitting devices in VLC systems. LEDs have multiple applications and are used in lighting systems, mobile phones and TV displays. While OLEDs don’t offer the same speed as inorganic LEDs and laser diodes, they are cheaper to produce, recyclable and more sustainable.
    The data rate the team achieved with the pioneering device is high enough to support an indoor point-to-point link, with a view to IoT applications.
    The researchers highlight the possibility of achieving such data rates without computationally complex and power-demanding equalisers. Together with the absence of toxic heavy metals in the active layer of the OLEDs, the new VLC setup is promising for the integration of portable, wearable or implantable organic biosensors.

    Story Source:
    Materials provided by Newcastle University.

  • Will telehealth services become the norm following COVID-19 pandemic?

    The onset of the COVID-19 pandemic has broadly affected how health care is provided in the United States. One notable change is the expanded use of telehealth services, which have been quickly adopted by many health care providers and payers, including Medicare, to ensure patients’ access to care while reducing their risk of exposure to the coronavirus.
    In an article published in JAMA Oncology, Trevor Royce, MD, MS, MPH, an assistant professor of radiation oncology at the University of North Carolina Lineberger Comprehensive Cancer Center and UNC School of Medicine, said the routine use of telehealth for patients with cancer could have long-lasting and unforeseen effects on the provision and quality of care.
    “The COVID-19 pandemic has resulted in the rapid deregulation of telehealth services. This was done in part by lifting geographical restrictions and broadening patient, health care professional, and service eligibility,” said Royce, the article’s corresponding author. “It is likely that aspects of telehealth will continue to be part of the health care delivery system beyond the pandemic.”
    The article’s other authors are UNC Lineberger’s Hanna K. Sanoff, MD, MPH, clinical medical director of the North Carolina Cancer Hospital and associate professor in the UNC School of Medicine Division of Hematology, and Amar Rewari, MD, MBA, from the Associates in Radiation Medicine, Adventist HealthCare Radiation Oncology Center in Rockville, Maryland.
    Royce said the widespread shift to telehealth was made possible, in part, by three federal economic stimulus packages and the Centers for Medicare and Medicaid Services making several policy changes in March that expanded Medicare recipients’ access to telehealth services.
    The policy changes included allowing telehealth services to be provided in a patient’s home. Medicare previously paid for telehealth services only in facilities in nonurban areas or areas with a health professional shortage. Medicare also approved payment for new patient appointments, expanded telehealth coverage to include 80 additional services, allowed for services to be carried out on a wider assortment of telecommunication systems — including remote video communications platforms, such as Zoom — and modified the restrictions on who can provide and supervise care.
    While the potential benefits of telehealth have been demonstrated during the pandemic, Royce said they must be balanced with concerns about care quality and safety.
    “There is a lot we don’t know about telehealth, and how its rapid adoption will impact our patients,” Royce said. “How will the safety and quality of care be impacted? How will we integrate essential components of the traditional doctor visit, including physical exam, lab work, scans and imaging? Will patients and doctors be more or less satisfied with their care? These are all potential downsides if we are not thoughtful with our adoption.”
    He said appropriate oversight of care is critical. There will be a continued need for objective patient assessments, such as patient-reported outcomes, physical examinations and laboratory tests, as well as for measuring care quality and monitoring for fraud. There are also a number of standard measures of care quality that can be implemented during the transition to telehealth, including tracking emergency room visits, hospitalizations and adverse events.
    Telehealth presents other challenges as well. Though technology and internet access are now more widely available, they are not universally accessible. Where people live, their socioeconomic status and their comfort level with technology can all be barriers to using telehealth services. A reliance on telehealth might also lower participation in clinical trials, which can require regular in-person appointments.
    “Telehealth can be used to improve access to care in traditionally hard-to-reach populations. However, it is important to acknowledge that if we are not thoughtful in its adoption, the opposite could be true,” Royce said. “For example, will lower socioeconomic groups have the same level of access to an adequate internet connection or cellular services that make a virtual video visit possible? Telehealth needs to be adopted with equity in mind.”

  • 'Blinking' crystals may convert CO2 into fuels

    Imagine tiny crystals that “blink” like fireflies and can convert carbon dioxide, a key cause of climate change, into fuels.
    A Rutgers-led team has created ultra-small titanium dioxide crystals that exhibit unusual “blinking” behavior and may help to produce methane and other fuels, according to a study in the journal Angewandte Chemie. The crystals, also known as nanoparticles, stay charged for a long time and could benefit efforts to develop quantum computers.
    “Our findings are quite important and intriguing in a number of ways, and more research is needed to understand how these exotic crystals work and to fulfill their potential,” said senior author Tewodros (Teddy) Asefa, a professor in the Department of Chemistry and Chemical Biology in the School of Arts and Sciences at Rutgers University-New Brunswick. He’s also a professor in the Department of Chemical and Biochemical Engineering in the School of Engineering.
    More than 10 million metric tons of titanium dioxide are produced annually, making it one of the most widely used materials, the study notes. It is used in sunscreens, paints, cosmetics and varnishes, for example. It’s also used in the paper and pulp, plastic, fiber, rubber, food, glass and ceramic industries.
    The team of scientists and engineers discovered a new way to make extremely small titanium dioxide crystals. While it’s still unclear why the engineered crystals blink and research is ongoing, the “blinking” is believed to arise from single electrons trapped on titanium dioxide nanoparticles. At room temperature, electrons — surprisingly — stay trapped on nanoparticles for tens of seconds before escaping and then become trapped again and again in a continuous cycle.
    The crystals, which blink when exposed to a beam of electrons, could be useful for environmental cleanups, sensors, electronic devices and solar cells, and the research team will further explore their capabilities.

    Story Source:
    Materials provided by Rutgers University.

  • Machining the heart: New predictor for helping to beat chronic heart failure

    Tens of millions of people worldwide have chronic heart failure, and only a little over half of them survive five years beyond their diagnosis. Now, researchers from Japan are helping doctors to assign patients to groups based on their specific needs, to improve medical outcomes.
    In a study recently published in the Journal of Nuclear Cardiology, researchers from Kanazawa University have used computer science to disentangle patients most at risk of sudden arrhythmic cardiac death from patients most at risk of heart failure death.
    Doctors have many methods at their disposal for diagnosing chronic heart failure. However, there’s a need to better identify which treatment to pursue, in accordance with the risks of each approach. When combined with conventional clinical tests, imaging with a tracer molecule known as iodine-123-labelled MIBG can help discriminate between high-risk and low-risk patients. However, there has been no way to assess the risk of arrhythmic death separately from the risk of heart failure death, something the researchers at Kanazawa University aimed to address.
    “We used artificial intelligence to show that numerous variables work in synergy to better predict chronic heart failure outcomes,” explains lead author of the study Kenichi Nakajima. “No single variable, in and of itself, is quite up to the task.”
    To do this, the researchers examined the medical records of 526 patients with chronic heart failure who underwent consecutive iodine-123-MIBG imaging and standard clinical testing. Conventional medical care proceeded as normal after imaging.
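    A minimal sketch of this kind of outcome stratification appears below; it is a hypothetical illustration, not the authors’ model, and the file, variable names and classifier choice are assumptions.

    ```python
    # Hypothetical sketch: classify chronic-heart-failure outcomes
    # (0 = alive, 1 = heart failure death, 2 = arrhythmic event) from
    # clinical variables plus MIBG imaging activity.
    import pandas as pd
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    df = pd.read_csv("chf_cohort.csv")            # hypothetical cohort file
    X = df[["age", "mibg_hm_ratio", "nyha_class", "lvef"]]
    y = df["outcome"]

    clf = GradientBoostingClassifier(random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"Cross-validated accuracy: {scores.mean():.2f}")
    ```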
    “The results were clear,” says Nakajima. “Heart failure death was most common in older adult patients with very low MIBG activity, worse New York Heart Association class, and comorbidities.”
    Furthermore, arrhythmia was most common in younger patients with moderately low iodine-123-MIBG activity and less serious heart failure. Doctors can use the Kanazawa University researchers’ results to tailor medical care; for example, by choosing the type of implantable defibrillator most likely to meet a patient’s needs.
    “It’s important to note that our results need to be confirmed in a larger study,” explains Nakajima. “In particular, the arrhythmia outcomes were perhaps too infrequent to be clinically reliable.”
    Given that chronic heart failure is a global problem that, if not treated appropriately, frequently kills within a few years of diagnosis, it’s essential to start the most appropriate medical care as soon as possible. With a reliable test that predicts which patients most likely need which treatments, a greater number of patients are likely to live longer.

    Story Source:
    Materials provided by Kanazawa University.

  • Recognizing fake images using frequency analysis

    They look deceptively real, but they are made by computers: so-called deep-fake images are generated by machine-learning algorithms, and humans are pretty much unable to distinguish them from real photos. Researchers at the Horst Görtz Institute for IT Security at Ruhr-Universität Bochum and the Cluster of Excellence “Cyber Security in the Age of Large-Scale Adversaries” (CASA) have developed a new method for efficiently identifying deep-fake images. To this end, they analyse the images in the frequency domain, using an established signal processing technique.
    The team presented their work at the International Conference on Machine Learning (ICML), one of the leading conferences in the field of machine learning, on 15 July 2020. The researchers have also made their code freely available online at https://github.com/RUB-SysSec/GANDCTAnalysis, so that other groups can reproduce their results.
    Interaction of two algorithms results in new images
    Deep-fake images — a portmanteau word from “deep learning” for machine learning and “fake” — are generated with the help of computer models, so-called Generative Adversarial Networks, GANs for short. Two algorithms work together in these networks: the first algorithm creates random images based on certain input data. The second algorithm needs to decide whether the image is a fake or not. If the image is found to be a fake, the second algorithm gives the first algorithm the command to revise the image — until it no longer recognises it as a fake.
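    The adversarial interplay described above can be sketched in a few lines of code. The toy example below (written with PyTorch, on synthetic 2-D points rather than images) illustrates the training loop of a generic GAN; it is not the networks used to produce deep-fake faces.

    ```python
    # Toy GAN: a generator learns to mimic a simple 2-D data distribution
    # while a discriminator learns to tell real samples from generated ones.
    import torch
    import torch.nn as nn

    G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))  # generator
    D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))  # discriminator
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()

    for step in range(2000):
        real = torch.randn(64, 2) * 0.5 + 2.0        # stand-in "real" samples
        fake = G(torch.randn(64, 8))                 # generator's current attempts

        # Discriminator update: label real samples 1, generated samples 0.
        d_loss = loss_fn(D(real), torch.ones(64, 1)) + \
                 loss_fn(D(fake.detach()), torch.zeros(64, 1))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()

        # Generator update: try to make the discriminator label fakes as real.
        g_loss = loss_fn(D(G(torch.randn(64, 8))), torch.ones(64, 1))
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    ```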
    In recent years, this technique has helped make deep-fake images more and more authentic. On the website www.whichfaceisreal.com, users can check if they’re able to distinguish fakes from original photos. “In the era of fake news, it can be a problem if users don’t have the ability to distinguish computer-generated images from originals,” says Professor Thorsten Holz from the Chair for Systems Security.
    For their analysis, the Bochum-based researchers used the data sets that also form the basis of the above-mentioned page “Which face is real.” In this interdisciplinary project, Joel Frank, Thorsten Eisenhofer and Professor Thorsten Holz from the Chair for Systems Security cooperated with Professor Asja Fischer from the Chair of Machine Learning as well as Lea Schönherr and Professor Dorothea Kolossa from the Chair of Digital Signal Processing.
    Frequency analysis reveals typical artefacts
    To date, deep-fake images have been analysed using complex statistical methods. The Bochum group chose a different approach by converting the images into the frequency domain using the discrete cosine transform. The generated image is thus expressed as the sum of many different cosine functions. Natural images consist mainly of low-frequency functions.
    The analysis showed that images generated by GANs exhibit artefacts in the high-frequency range. For example, a typical grid structure emerges in the frequency representation of fake images. “Our experiments showed that these artefacts occur not only in GAN-generated images. They are a structural problem of all deep learning algorithms,” explains Joel Frank from the Chair for Systems Security. “We assume that the artefacts described in our study will always tell us whether the image is a deep-fake image created by machine learning,” adds Frank. “Frequency analysis is therefore an effective way to automatically recognise computer-generated images.”
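    A rough sketch of such a frequency check follows, assuming a hypothetical image file. The published method trains a classifier on DCT spectra; this toy version merely compares high-frequency energy against the total.

    ```python
    # Transform an image with the 2-D discrete cosine transform and
    # inspect how much energy sits in the high-frequency quadrant.
    import numpy as np
    from scipy.fft import dctn
    from PIL import Image

    img = np.asarray(Image.open("face.png").convert("L"), dtype=float)  # hypothetical file
    spectrum = np.abs(dctn(img, norm="ortho"))

    # Natural images concentrate energy at low frequencies (top-left of the DCT).
    h, w = spectrum.shape
    high_band = spectrum[h // 2:, w // 2:]
    ratio = high_band.sum() / spectrum.sum()

    # GAN-generated images tend to show excess high-frequency energy
    # (e.g., grid-like artefacts), pushing this ratio up.
    print(f"High-frequency energy fraction: {ratio:.4f}")
    ```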

    Story Source:
    Materials provided by Ruhr-University Bochum.

  • Climate change made Siberia’s heat wave at least 600 times more likely

    The intense heat wave that gripped Siberia during the first half of 2020 would have been impossible without human-caused climate change, a new study finds. Researchers with the World Weather Attribution Network report that climate change made the prolonged heat in the region at least 600 times more likely — and possibly as much as 99,000 times more likely.
    “We wouldn’t expect the natural world to generate [such a heat wave] in anything less than 800,000 years or so,” climate scientist Andrew Ciavarella of the U.K. Met Office in Exeter, England, said July 14 in a news conference. It’s “effectively impossible without human influence.”
    The new study, posted online July 15, examined two aspects of the heat wave: the persistence and intensity of average temperatures across Siberia from January to June 2020; and daily maximum temperatures during June 2020 in the remote Siberian town of Verkhoyansk.
    Tiny Verkhoyansk made international headlines when it logged a record high temperature of 38° Celsius (100.4° Fahrenheit) on June 20 (SN: 6/23/20). The record was just one extreme amid a larger and longer event in the region that has led to a series of human and natural disasters (SN: 7/1/20). Those include wildfires across Siberia, the collapse of a fuel tank in the mining city of Norilsk due to sagging permafrost, and heat health effects (SN: 4/3/18).
    Using observational data from Verkhoyansk and other Siberian weather stations, the researchers first assessed the rarity of the observed temperatures and determined temperature trends. Then they compared these observations with hundreds of climate simulations using different greenhouse gas warming scenarios. 
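    At its core, such an attribution estimate is a probability ratio: how often the observed extreme is matched or exceeded in simulations of a climate with human influence versus one without. A toy sketch with synthetic placeholder numbers (not the study’s model ensembles):

    ```python
    # Toy probability-ratio attribution with made-up temperature anomalies.
    import numpy as np

    rng = np.random.default_rng(0)
    natural = rng.normal(loc=0.0, scale=1.0, size=100_000)  # climate without human forcing
    forced = rng.normal(loc=2.0, scale=1.0, size=100_000)   # climate with human forcing

    threshold = 4.0                                         # illustrative observed anomaly
    p_natural = max((natural > threshold).mean(), 1e-6)     # guard against zero counts
    p_forced = (forced > threshold).mean()

    print(f"Probability ratio: {p_forced / p_natural:.0f}x more likely")
    ```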
    Had such a hot spell occurred in 1900 instead of 2020, it would have been at least 2 degrees Celsius cooler on average, the researchers found. In Verkhoyansk, climate change amped up June temperatures by at least 1 degree relative to 1900. And such heat waves are likely to become more common in the near future: by 2050, the report finds, temperatures in Siberia could rise by 2.5 to as much as 7 degrees compared with 1900.

  • Marine drifters: Interdisciplinary study explores plankton diversity

    Ocean plankton are the drifters of the marine world. They’re algae, animals, bacteria, or protists that are at the mercy of the tide and currents. Many are microscopic and hidden from view, barely observable with the naked eye, though others, like jellyfish, can grow relatively large.
    There’s one thing about these drifting critters that has puzzled ecologists for decades — the diversity among ocean plankton is much higher than expected. Generally, in any given ocean sample, there are many rare species of plankton and a small number of abundant species. Researchers from the Okinawa Institute of Science and Technology Graduate University (OIST) have published a paper in Science Advances that combines mathematical models with metagenomics and marine science to uncover why this might be the case.
    “For years, scientists have been asking why there are so many species in the ocean,” said Professor Simone Pigolotti, who leads OIST’s Biological Complexity Unit. Professor Pigolotti explained that plankton can be transported across very large distances by currents, so they don’t seem to be limited by dispersal. This would suggest that niche preference is the factor that determines species diversity — in other words, a single species will outcompete all other species if the environment suits it best, leading to communities with only a few, highly abundant species.
    “Our research explored the theory that ocean currents promote species diversity, not because they help plankton to disperse, but because they can actually limit dispersal by creating barriers,” said Professor Pigolotti. “In contrast, when we looked at samples from lakes, where there are little or no currents, we found more abundant species, but fewer species altogether.”
    At first glance, this might seem counter-intuitive. But while currents may carry plankton from one area to another, they also prevent the plankton from crossing to the other side of the current. Thus, these currents reduce competition and force each species of plankton to coexist with other species, albeit in small numbers.
    Combining DNA tests with mathematical models
    For over a century, ecologists have measured diversity by counting the number of species, such as birds or insects, in an area. This allowed them to find the proportions of abundant species versus rare species. Today, the task is streamlined through both quantitative modelling that can predict species distributions and metagenomics — instead of just counting species, researchers can efficiently collect all the DNA in a sample.
    “Simply counting the number of species in a sample is very time-consuming,” said Professor Tom Bourguignon, who leads OIST’s Evolutionary Genomics Unit. “With advancements in sequencing technologies, we can run just one test and have several thousand DNA sequences that represent a good estimate of planktonic diversity.”
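    In its simplest form, that estimation collapses identical sequences into species and tallies their abundances. A toy sketch with made-up reads (real pipelines cluster similar sequences rather than requiring exact matches):

    ```python
    # Toy diversity estimate: collapse identical reads into "species"
    # and count how many are rare (observed only once).
    from collections import Counter

    reads = [
        "ACGTACGT", "ACGTACGT", "TTGGCCAA",   # stand-ins for marker-gene reads
        "ACGTACGT", "GGGTTTAA", "TTGGCCAA",
    ]

    abundances = Counter(reads)
    n_species = len(abundances)
    n_rare = sum(1 for count in abundances.values() if count == 1)

    print(f"{n_species} species detected, {n_rare} of them rare (singletons)")
    ```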
    For this study, the researchers were particularly interested in protists — microscopic, usually single-celled, planktonic organisms. The group created a mathematical model that considered the role of oceanic currents in determining the genealogy of protists through simulations. They couldn’t just simulate a protist community at the DNA level because there would be a huge number of individuals. So, instead, they simulated the individuals in a given sample from the ocean.
    To find out how closely related the individuals were, and whether they were of the same species, the researchers then looked back in time. “We created a trajectory that went back one hundred years,” said Professor Pigolotti. “If two individuals came from a common ancestor in the timescale of our simulation, then we classed them as the same species.”
    What they were specifically measuring was the number of species, and the number of individuals per species. The model was simulated with and without ocean currents. As the researchers had hypothesized, it showed that the presence of ocean currents caused a sharp increase in the number of protist species, but a decline in the number of individuals per species.
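    A toy backward-in-time sketch of this mechanism is given below, assuming a one-dimensional habitat in which a barrier at the origin stands in for a current. Lineages that land on the same site coalesce and are classed as one species, echoing the study’s common-ancestor criterion; because the barrier prevents coalescence across the current, more distinct lineages, and hence species, tend to survive. This illustrates the idea only and is not the study’s ocean model.

    ```python
    # Toy coalescent sketch: trace sampled lineages backward in time on a
    # bounded 1-D habitat; a barrier at the origin mimics an ocean current.
    import random

    def surviving_lineages(n=60, steps=2000, barrier=True, seed=1):
        rng = random.Random(seed)
        pos = [rng.randint(-10, 10) for _ in range(n)]   # lineages of sampled individuals
        for _ in range(steps):
            moved = []
            for x in pos:
                new = max(-10, min(10, x + rng.choice([-1, 1])))
                if barrier and (x < 0) != (new < 0):
                    new = x                              # the current blocks crossing
                moved.append(new)
            pos = list(set(moved))                       # co-located lineages coalesce
        return len(pos)                                  # distinct ancestors ~ species count

    print("species with a current:   ", surviving_lineages(barrier=True))
    print("species without a current:", surviving_lineages(barrier=False))
    ```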
    To confirm the results of this model, the researchers then analyzed datasets from two studies of aquatic protists: the first of oceanic protists’ DNA sequences and the second of freshwater protists’ DNA sequences. They found that, on average, oceanic samples contained more rare species and fewer abundant species and, overall, a larger number of species. This agreed with the model’s predictions.
    “Our results support the theory that ocean currents positively impact the diversity of rare aquatic protists by creating these barriers,” said Professor Pigolotti. “The project was very interdisciplinary. By combining theoretical physics, marine science, and metagenomics, we’ve shed new light on a classic problem in ecology, which is of relevance for marine biodiversity.”