More stories

  • Scientific software – Quality not always good

    Computational tools are indispensable in almost all scientific disciplines. Especially when large amounts of research data are generated and must be processed quickly, reliable, carefully developed software is crucial for analyzing and correctly interpreting the data. Nevertheless, scientific software can have quality deficiencies. To evaluate software quality in an automated way, computer scientists at Karlsruhe Institute of Technology (KIT) and the Heidelberg Institute for Theoretical Studies (HITS) have designed the SoftWipe tool.
    “Adherence to coding standards is rarely considered in scientific software, although it can even lead to incorrect scientific results,” says Professor Alexandros Stamatakis, who works both at HITS and at the Institute of Theoretical Informatics (ITI) of KIT. The open-source SoftWipe tool provides a fast, reliable, and cost-effective approach to addressing this problem by automatically assessing adherence to software development standards. Besides designing the tool, the computer scientists benchmarked 48 scientific software tools from different research areas to assess the degree to which they meet coding standards.
    “SoftWipe can also be used in the review process of scientific software and support the software selection process,” adds Adrian Zapletal. The Master’s student and his fellow student Dimitri Höhler have substantially contributed to the development of SoftWipe. To select assessment criteria, they relied on existing standards that are used in safety-critical environments, such as at NASA or CERN.
    “Our research revealed enormous discrepancies in software quality,” says co-author Professor Carsten Sinz of ITI. Many programs, such as covid-sim, which is used in the UK for mathematical modeling of the COVID-19 disease, had a very low quality score and thus performed poorly in the ranking. The researchers recommend using programs such as SoftWipe by default in the selection and review process of software for scientific purposes.
    How Does SoftWipe Work?
    SoftWipe is a pipeline written in Python 3 that uses several static and dynamic code analyzers (most of them freely available) to assess the code quality of software written in C/C++. In this process, SoftWipe compiles the software and then executes it so that programming errors can be detected at run time. Based on the output of the code analysis tools, SoftWipe then computes an overall quality score between 0 (poor) and 10 (excellent).
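    The release does not spell out the scoring formula, but a minimal sketch of such a pipeline, assuming hypothetical analyzer names, weights, and a warnings-per-line normalization (illustrative only, not the actual SoftWipe implementation), might look like this:

    ```python
    import subprocess

    # Hypothetical analyzers and weights; SoftWipe's real tool set and
    # weighting scheme are documented in the original publication.
    ANALYZERS = {
        "cppcheck": 0.5,    # static analysis of C/C++ sources
        "clang-tidy": 0.5,  # style and bug-prone-pattern checks
    }

    def count_warnings(tool, source_dir):
        """Run one analyzer on a source directory and count reported warnings."""
        result = subprocess.run([tool, source_dir], capture_output=True, text=True)
        return result.stdout.count("warning")

    def quality_score(source_dir, lines_of_code):
        """Map weighted warnings per line onto a 0 (poor) to 10 (excellent) scale."""
        weighted = sum(w * count_warnings(t, source_dir) for t, w in ANALYZERS.items())
        per_line = weighted / max(lines_of_code, 1)
        return max(0.0, 10.0 - 10.0 * min(per_line, 1.0))
    ```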
    Story Source:
    Materials provided by Karlsruher Institut für Technologie (KIT). Note: Content may be edited for style and length.

  • Quantum electronics: 'Bite' defects in bottom-up graphene nanoribbons

    Graphene nanoribbons (GNRs), narrow strips of single-layer graphene, have interesting physical, electrical, thermal, and optical properties because of the interplay between their crystal and electronic structures. These novel characteristics have pushed them to the forefront in the search for ways to advance next-generation nanotechnologies.
    While bottom-up fabrication techniques now allow the synthesis of a broad range of graphene nanoribbons that feature well-defined edge geometries, widths, and heteroatom incorporations, the question of whether or not structural disorder is present in these atomically precise GNRs, and to what extent, is still subject to debate. The answer to this riddle is of critical importance to any potential applications or resulting devices.
    Collaboration between Oleg Yazyev’s Chair of Computational Condensed Matter Physics theory group at EPFL and Roman Fasel’s experimental nanotech@surfaces Laboratory at Empa has produced two papers that look at this issue in armchair-edged and zigzag-edged graphene nanoribbons.
    “In these two works, we focused on characterizing ‘bite’ defects in graphene nanoribbons and their implications on GNR properties,” explains Gabriela Borin Barin from Empa’s nanotech@surfaces lab. “We observed that even though the presence of these defects can disrupt GNRs’ electronic transport, they could also yield spin-polarized currents. These are important findings in the context of the potential applications of GNRs in nanoelectronics and quantum technology.”
    Armchair graphene nanoribbons
    The paper “Quantum electronic transport across ‘bite’ defects in graphene nanoribbons,” recently published in 2D Materials, specifically looks at 9-atom-wide armchair graphene nanoribbons (9-AGNRs). The mechanical robustness, long-term stability under ambient conditions, easy transferability onto target substrates, scalability of fabrication, and suitable band-gap width of these GNRs have made them one of the most promising candidates for integration as active channels in field-effect transistors (FETs). Indeed, among the graphene-based electronic devices realized so far, 9-AGNR-FETs display the highest performance.

  • Data from smartwatches can help predict clinical blood test results

    Smartwatches and other wearable devices may be used to sense illness, dehydration and even changes to the red blood cell count, according to biomedical engineers and genomics researchers at Duke University and the Stanford University School of Medicine.
    The researchers say that, with the help of machine learning, wearable device data on heart rate, body temperature and daily activities may be used to predict health measurements that are typically observed during a clinical blood test. The study appears in Nature Medicine on May 24, 2021.
    During a doctor’s office visit, a medical worker usually measures a patient’s vital signs, including their height, weight, temperature and blood pressure. Although this information is filed away in a person’s long-term health record, it isn’t usually used to create a diagnosis. Instead, physicians order a clinical lab test of a patient’s urine or blood to gather specific biological information that helps guide health decisions.
    These vital measurements and clinical tests can inform a doctor about specific changes to a person’s health, like if a patient has diabetes or has developed pre-diabetes, if they’re getting enough iron or water in their diet, and if their red or white blood cell count is in the normal range.
    But these tests are not without their drawbacks. They require an in-person visit, which isn’t always easy for patients to arrange, and procedures like a blood draw can be invasive and uncomfortable. Most notably, these vitals and clinical samples are not usually taken at regular and controlled intervals. They only provide a snapshot of a patient’s health on the day of the doctor’s visit, and the results can be influenced by a host of factors, like when a patient last ate or drank, stress, or recent physical activity.
    “There is a circadian (daily) variation in heart rate and in body temperature, but these single measurements in clinics don’t capture that natural variation,” said Duke’s Jessilyn Dunn, a co-lead and co-corresponding author of the study. “But devices like smartwatches or Fitbits have the ability to track these measurements and natural changes over a prolonged period of time and identify when there is variation from that natural baseline.”
    To gain a consistent and fuller picture of patients’ health, Dunn, an assistant professor of biomedical engineering at Duke, Michael Snyder, a professor and chair of genetics at Stanford, and their team wanted to explore whether long-term data gathered from wearable devices could match changes observed during clinical tests and help indicate health abnormalities.
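    As a rough illustration of the idea only (synthetic data and invented features, not the study’s actual models or measurements), predicting a lab value from aggregated wearable features can be sketched with scikit-learn:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 500

    # Synthetic stand-ins for wearable features aggregated over time:
    # resting heart rate (bpm), skin-temperature deviation (deg C), daily steps.
    X = np.column_stack([
        rng.normal(65, 8, n),
        rng.normal(0.0, 0.4, n),
        rng.normal(8000, 2500, n),
    ])
    # Synthetic lab value loosely coupled to the features (stand-in for, e.g.,
    # a red-blood-cell measure); real targets would come from clinical tests.
    y = 45 - 0.05 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 1, n)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    print("R^2 on held-out data:", model.score(X_test, y_test))
    ```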

  • Machine learning platform identifies activated neurons in real-time

    Biomedical engineers at Duke University have developed an automatic process that uses streamlined artificial intelligence (AI) to identify active neurons in videos faster and more accurately than current techniques.
    The technology should allow researchers to watch an animal’s brain activity in real time as the animal behaves.
    The work appears May 20 in Nature Machine Intelligence.
    One of the ways researchers study the activity of neurons in living animals is through a process known as two-photon calcium imaging, which makes active neurons appear as flashes of light. Analyzing these videos, however, typically requires a human to circle every burst of intensity they see, in a process called segmentation (a toy sketch of this task appears at the end of this story). While this may seem straightforward, the bursts often overlap in recordings where thousands of neurons are imaged simultaneously. Analyzing just a five-minute video this way could take weeks or even months.
    “People try to figure out how the brain works by recording the activity of neurons as an animal does a behavior to study the relationship between the two,” said Yiyang Gong, the primary author on the paper. “But manual segmentation creates a big bottleneck and doesn’t allow researchers to see the activation of the neurons in real-time.”
    Gong, an assistant professor of biomedical engineering, and Sina Farsiu, a professor of biomedical engineering, previously addressed this bottleneck in a 2019 paper, in which they described a deep-learning platform that maps active neurons as accurately as humans in a fraction of the time. But because the videos can be tens of gigabytes, researchers still have to wait hours or days for them to process.
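    As noted above, segmentation amounts to finding bright, transient regions in a fluorescence stack. Here is a toy sketch of the naive, non-learning baseline that such platforms improve upon (illustrative only; not the authors’ deep-learning method):

    ```python
    import numpy as np
    from scipy import ndimage

    def candidate_active_regions(video, k=5.0):
        """video: (frames, height, width) fluorescence stack.
        Returns labeled groups of pixels that flash well above baseline."""
        baseline = np.median(video, axis=0)           # per-pixel resting fluorescence
        dff = (video - baseline) / (baseline + 1e-6)  # delta-F over F
        active = dff.max(axis=0) > k * dff.std()      # pixels that ever flash strongly
        return ndimage.label(active)                  # group neighbors into regions

    # Toy stack: noise plus one brief, bright "calcium flash".
    rng = np.random.default_rng(1)
    video = rng.normal(1.0, 0.05, size=(100, 64, 64))
    video[40:45, 30:34, 30:34] += 1.0
    labels, n_regions = candidate_active_regions(video)
    print("candidate regions found:", n_regions)
    ```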

  • AI spots neurons better than human experts

    A new combination of optical coherence tomography (OCT), adaptive optics and deep neural networks should enable better diagnosis and monitoring for neuron-damaging eye and brain diseases like glaucoma.
    Biomedical engineers at Duke University led a multi-institution consortium to develop the process, which easily and precisely tracks changes in the number and shape of retinal ganglion cells in the eye.
    This work appears in a paper published on May 3 in the journal Optica.
    The retina of the eye is an extension of the central nervous system. Ganglion cells are one of the primary neurons in the eye that process and send visual information to the brain. In many neurodegenerative diseases like glaucoma, ganglion cells degenerate and disappear, leading to irreversible blindness. Traditionally, researchers use OCT, an imaging technology similar to ultrasound that uses light instead of sound, to peer beneath layers of eye tissue to diagnose and track the progression of glaucoma and other eye diseases.
    Although OCT allows researchers to efficiently view the ganglion cell layer in the retina, the technique is only sensitive enough to show the thickness of the cell layer; it can’t reveal individual ganglion cells. This hinders early diagnosis and rapid tracking of disease progression, as large numbers of ganglion cells must disappear before physicians can see changes in the layer’s thickness.
    To remedy this, a recent technology called adaptive optics OCT (AO-OCT) enables imaging sensitive enough to view individual ganglion cells. Adaptive optics minimizes the effect of the optical aberrations that occur when examining the eye, which are a major limiting factor in achieving high resolution in OCT imaging.

  • Quantum sensing: Odd angles make for strong spin-spin coupling

    Sometimes things are a little out of whack, and it turns out to be exactly what you need.
    That was the case when orthoferrite crystals turned up at a Rice University laboratory slightly misaligned. Those crystals inadvertently became the basis of a discovery that should resonate with researchers studying spintronics-based quantum technology.
    Rice physicist Junichiro Kono, alumnus Takuma Makihara and their collaborators found that an orthoferrite material, in this case yttrium iron oxide, placed in a high magnetic field shows uniquely tunable, ultrastrong interactions between magnons in the crystal.
    Orthoferrites are iron oxide crystals with the addition of one or more rare-earth elements.
    Magnons are quasiparticles, ghostly constructs that represent the collective excitation of electron spin in a crystal lattice.
    What one has to do with the other is the basis of a study that appears in Nature Communications, where Kono and his team describe an unusual coupling between two magnons dominated by antiresonance, through which both magnons gain or lose energy simultaneously.

  • Young teens should only use recreational internet and video games one hour daily

    Middle-school aged children who use the internet, social media or video games recreationally for more than an hour each day during the school week have significantly lower grades and test scores, according to a study from the Center for Gambling Studies at Rutgers University-New Brunswick.
    The findings appear in the journal Computers in Human Behavior.
    Researchers say the findings give parents and children a moderate threshold for using entertainment-related technology — no more than one hour daily on school days and four hours a day on weekends.
    “Interactive technology is widely used to promote children’s educational access and achievement,” said lead author Vivien (Wen Li) Anthony, an assistant professor at the School of Social Work and research associate at the Rutgers Center for Gambling Studies. “During the COVID-19 pandemic, technology has been essential to facilitating remote learning. At the same time, there is a growing concern that excessive technology use, particularly for entertainment, may adversely affect children’s educational development by facilitating undesirable study habits and detracting from time spent on learning activities.”
    The researchers, who include Professor Lia Nower of the Rutgers Center for Gambling Studies and a researcher from Renmin University of China, analyzed data from the China Education Panel Survey, a national survey of the educational needs and outcomes of children in China. Approximately 10,000 first-year middle school students were surveyed and followed; their average age was 13.5 years.
    The results showed that children who used the internet, social media or video games for entertainment four or more hours daily were four times more likely to skip school than those who did not (a worked example of this kind of comparison appears at the end of this story). Boys used interactive technology for entertainment significantly more than girls. Boys also performed worse and showed lower school engagement levels than girls.
    “Such findings are critical, particularly in light of the recent movement toward online learning in countries throughout the world,” said Anthony. “In a learning environment that integrates the internet, it is easy for children to move across educational and entertainment platforms during learning without alerting teachers or adults to alternate activities.”
    Anthony said children in the study who used technology in moderation (i.e., less than one hour per day on weekends) experienced less boredom at school, potentially due to the positive effects of participation in social media, video games and video streaming such as peer bonding and relationship building. Using interactive technology for entertainment in moderation advanced children’s cognitive development.
    The findings suggest that parents should place time limits on their children’s interactive technology use, and that parents and teachers should help children develop effective time management and self-regulation skills to reduce their reliance on technology.
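    For the comparison flagged above, here is a minimal sketch of how a “four times more likely” odds ratio is computed from a 2x2 survey table (the counts are invented for illustration, not taken from the study):

    ```python
    # Hypothetical 2x2 survey table (counts invented for illustration, not
    # the study's data): rows = 4+ hours of daily entertainment use vs. less,
    # columns = skipped school vs. did not.
    heavy_skip, heavy_no_skip = 80, 420
    light_skip, light_no_skip = 100, 2100

    odds_heavy = heavy_skip / heavy_no_skip   # odds of skipping among heavy users
    odds_light = light_skip / light_no_skip   # odds of skipping among the rest
    print(f"odds ratio: {odds_heavy / odds_light:.1f}")  # -> 4.0 with these counts
    ```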
    Story Source:
    Materials provided by Rutgers University. Note: Content may be edited for style and length.

  • Pristine quantum criticality found

    U.S. and Austrian physicists searching for evidence of quantum criticality in topological materials have found one of the most pristine examples yet observed.
    In an open access paper published online in Science Advances, researchers from Rice University, Johns Hopkins University, the Vienna University of Technology (TU Wien) and the National Institute of Standards and Technology (NIST) present the first experimental evidence to suggest that quantum criticality — a disordered state in which electrons waver between competing states of order — may give rise to topological phases, “protected” quantum states that are of growing interest for quantum computation.
    “The thought that underlies this work is, ‘Why not quantum criticality?’” said study co-author Qimiao Si, a theoretical physicist from Rice who’s spent two decades studying the interplay between quantum criticality and one of the most mysterious processes in modern physics, high-temperature superconductivity.
    “Maybe quantum criticality is not the only mechanism that can nucleate topological phases of matter, but we know quantum criticality provides a setting in which things are fluctuating and from which new states of matter can emerge,” said Si, director of the Rice Center for Quantum Materials (RCQM).
    In the study, Si and colleagues, including experimentalist Silke Bühler-Paschen, a longtime collaborator at TU Wien, and Collin Broholm of both NIST and Johns Hopkins, studied a semimetal made from one part cerium, four parts ruthenium and six parts tin. Topological phases have not been observed in CeRu4Sn6, but it is similar to a number of other materials in which they have been observed. And it is known to host the Kondo effect, a strong interaction between the magnetic moments of electrons attached to atoms in a metal and the spins of passing conduction electrons.
    In typical metals and semiconductors, interactions between electrons are weak enough that engineers and physicists need not take them into account when designing a computer chip or other electronic device. Not so in “strongly correlated” materials, like Kondo semimetals. In these, the overall behavior of the material — and of any device built from it — relies on electron-electron interactions. And these are the interactions that give rise to quantum criticality.