More stories

  • Quantum electronics: 'Bite' defects in bottom-up graphene nanoribbons

    Graphene nanoribbons (GNRs), narrow strips of single-layer graphene, have interesting physical, electrical, thermal, and optical properties because of the interplay between their crystal and electronic structures. These novel characteristics have pushed them to the forefront in the search for ways to advance next-generation nanotechnologies.
    While bottom-up fabrication techniques now allow the synthesis of a broad range of graphene nanoribbons that feature well-defined edge geometries, widths, and heteroatom incorporations, the question of whether or not structural disorder is present in these atomically precise GNRs, and to what extent, is still subject to debate. The answer to this riddle is of critical importance to any potential applications or resulting devices.
    Collaboration between Oleg Yazyev’s Chair of Computational Condensed Matter Physics theory group at EPFL and Roman Fasel’s experimental nanotech@surfaces Laboratory at Empa has produced two papers that look at this issue in armchair-edged and zigzag-edged graphene nanoribbons.
    “In these two works, we focused on characterizing ‘bite’ defects in graphene nanoribbons and their implications for GNR properties,” explains Gabriela Borin Barin from Empa’s nanotech@surfaces lab. “We observed that even though the presence of these defects can disrupt GNRs’ electronic transport, they could also yield spin-polarized currents. These are important findings in the context of the potential applications of GNRs in nanoelectronics and quantum technology.”
    Armchair graphene nanoribbons
    The paper “Quantum electronic transport across ‘bite’ defects in graphene nanoribbons,” recently published in 2D Materials, specifically looks at 9-atom-wide armchair graphene nanoribbons (9-AGNRs). The mechanical robustness, long-term stability under ambient conditions, easy transferability onto target substrates, scalability of fabrication, and suitable band-gap width of these GNRs have made them one of the most promising candidates for integration as active channels in field-effect transistors (FETs). Indeed, among the graphene-based electronic devices realized so far, 9-AGNR-FETs display the highest performance.
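    How a single defect degrades ballistic transport can be illustrated with a toy model. The sketch below is illustrative only and is not the paper's calculation: it evaluates the Landauer transmission through a one-dimensional tight-binding chain in which one bond is weakened, a crude stand-in for a "bite" defect scattering carriers in a ribbon. The hopping values `t` and `t_defect` are arbitrary parameters chosen for the example.

```python
import numpy as np

def transmission(E, t=1.0, t_defect=0.7):
    """Landauer transmission T(E) through a single weakened bond (t_defect)
    joining two perfect 1D tight-binding leads with hopping t.
    Closed form: T = 4*eta*sin^2(k) / ((1-eta)^2 + 4*eta*sin^2(k)),
    with eta = (t_defect/t)^2 and dispersion E = -2*t*cos(k)."""
    k = np.arccos(-E / (2 * t))                 # wavevector inside the band
    eta = (t_defect / t) ** 2
    s2 = np.sin(k) ** 2
    return 4 * eta * s2 / ((1 - eta) ** 2 + 4 * eta * s2)

# Transmission dips below 1 everywhere once the bond is weakened
for E in np.linspace(-1.9, 1.9, 5):
    print(f"E = {E:+.2f}  T = {transmission(E):.3f}")
```

With `t_defect = t` the formula gives T = 1 (perfect chain), so the whole suppression here comes from the single defective bond; the real 9-AGNR calculation in the paper uses first-principles Green's-function transport, far beyond this sketch.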

  • Data from smartwatches can help predict clinical blood test results

    Smartwatches and other wearable devices may be used to sense illness, dehydration and even changes to the red blood cell count, according to biomedical engineers and genomics researchers at Duke University and the Stanford University School of Medicine.
    The researchers say that, with the help of machine learning, wearable device data on heart rate, body temperature and daily activities may be used to predict health measurements that are typically observed during a clinical blood test. The study appears in Nature Medicine on May 24, 2021.
    During a doctor’s office visit, a medical worker usually measures a patient’s vital signs, including their height, weight, temperature and blood pressure. Although this information is filed away in a person’s long-term health record, it isn’t usually used to create a diagnosis. Instead, physicians order clinical lab tests of a patient’s urine or blood to gather specific biological information that helps guide health decisions.
    These vital measurements and clinical tests can inform a doctor about specific changes to a person’s health, like if a patient has diabetes or has developed pre-diabetes, if they’re getting enough iron or water in their diet, and if their red or white blood cell count is in the normal range.
    But these tests are not without their drawbacks. They require an in-person visit, which isn’t always easy for patients to arrange, and procedures like a blood draw can be invasive and uncomfortable. Most notably, these vitals and clinical samples are not usually taken at regular and controlled intervals. They only provide a snapshot of a patient’s health on the day of the doctor’s visit, and the results can be influenced by a host of factors, like when a patient last ate or drank, stress, or recent physical activity.
    “There is a circadian (daily) variation in heart rate and in body temperature, but these single measurements in clinics don’t capture that natural variation,” said Duke’s Jessilyn Dunn, a co-lead and co-corresponding author of the study. “But devices like smartwatches or Fitbits have the ability to track these measurements and natural changes over a prolonged period of time and identify when there is variation from that natural baseline.”
    To gain a consistent and fuller picture of patients’ health, Dunn, an assistant professor of biomedical engineering at Duke, Michael Snyder, a professor and chair of genetics at Stanford, and their team wanted to explore if long-term data gathered from wearable devices could match changes that were observed during clinical tests and help indicate health abnormalities.

  • Machine learning platform identifies activated neurons in real-time

    Biomedical engineers at Duke University have developed an automatic process that uses streamlined artificial intelligence (AI) to identify active neurons in videos faster and more accurately than current techniques.
    The technology should allow researchers to watch an animal’s brain activity in real time as it behaves.
    The work appears May 20 in Nature Machine Intelligence.
    One of the ways researchers study the activity of neurons in living animals is through a process known as two-photon calcium imaging, which makes active neurons appear as flashes of light. Analyzing these videos, however, typically requires a human circling every burst of intensity they see in a process called segmentation. While this may seem straightforward, these bursts often overlap in spaces where thousands of neurons are imaged simultaneously. Analyzing just a five-minute video this way could take weeks or even months.
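    A minimal illustration of what "segmentation" means here, far simpler than the authors' AI approach: threshold a frame so that bright flashes stand out from the background, then group bright pixels into connected regions, each a candidate active neuron. The synthetic frame and threshold below are invented for the example.

```python
import numpy as np
from collections import deque

def segment(frame, threshold):
    """Label connected bright regions (4-connectivity) above threshold.
    Returns (label image, number of regions found)."""
    mask = frame > threshold
    labels = np.zeros(frame.shape, dtype=int)
    current = 0
    for i in range(frame.shape[0]):
        for j in range(frame.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                current += 1                      # new region: flood-fill it
                labels[i, j] = current
                q = deque([(i, j)])
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < frame.shape[0] and 0 <= nx < frame.shape[1]
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            q.append((ny, nx))
    return labels, current

# Synthetic frame with two bright "neurons" on a dark background
frame = np.zeros((20, 20))
frame[3:6, 3:6] = 1.0
frame[12:16, 10:14] = 1.0
labels, n_found = segment(frame, threshold=0.5)
print(n_found)  # → 2
```

This naive approach is exactly what breaks down when thousands of flashes overlap, which is why the deep-learning segmentation described in the article is needed.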
    “People try to figure out how the brain works by recording the activity of neurons as an animal does a behavior to study the relationship between the two,” said Yiyang Gong, the primary author on the paper. “But manual segmentation creates a big bottleneck and doesn’t allow researchers to see the activation of the neurons in real-time.”
    Gong, an assistant professor of biomedical engineering, and Sina Farsiu, a professor of biomedical engineering, previously addressed this bottleneck in a 2019 paper, where they shared the development of a deep-learning platform that maps active neurons as accurately as humans in a fraction of the time. But because videos can be tens of gigabytes, researchers still have to wait hours or days for them to process.

  • AI spots neurons better than human experts

    A new combination of optical coherence tomography (OCT), adaptive optics and deep neural networks should enable better diagnosis and monitoring for neuron-damaging eye and brain diseases like glaucoma.
    Biomedical engineers at Duke University led a multi-institution consortium to develop the process, which easily and precisely tracks changes in the number and shape of retinal ganglion cells in the eye.
    This work appears in a paper published on May 3 in the journal Optica.
    The retina of the eye is an extension of the central nervous system. Ganglion cells are one of the primary neurons in the eye that process and send visual information to the brain. In many neurodegenerative diseases like glaucoma, ganglion cells degenerate and disappear, leading to irreversible blindness. Traditionally, researchers use OCT, an imaging technology similar to ultrasound that uses light instead of sound, to peer beneath layers of eye tissue to diagnose and track the progression of glaucoma and other eye diseases.
    Although OCT allows researchers to efficiently view the ganglion cell layer in the retina, the technique is only sensitive enough to show the thickness of the cell layer — it can’t reveal individual ganglion cells. This hinders early diagnosis or rapid tracking of the disease progression, as large quantities of ganglion cells need to disappear before physicians can see the changes in thickness.
    To remedy this, a recent technology called adaptive optics OCT (AO-OCT) enables imaging sensitive enough to view individual ganglion cells. Adaptive optics is a technology that minimizes the effect of optical aberrations that occur when examining the eye, which are a major limiting factor in achieving high resolution in OCT imaging.

  • Quantum sensing: Odd angles make for strong spin-spin coupling

    Sometimes things are a little out of whack, and it turns out to be exactly what you need.
    That was the case when orthoferrite crystals turned up at a Rice University laboratory slightly misaligned. Those crystals inadvertently became the basis of a discovery that should resonate with researchers studying spintronics-based quantum technology.
    Rice physicist Junichiro Kono, alumnus Takuma Makihara and their collaborators found that an orthoferrite material, in this case yttrium iron oxide, showed uniquely tunable, ultrastrong interactions between magnons in the crystal when placed in a high magnetic field.
    Orthoferrites are iron oxide crystals with the addition of one or more rare-earth elements.
    Magnons are quasiparticles, ghostly constructs that represent the collective excitation of electron spin in a crystal lattice.
    What one has to do with the other is the basis of a study that appears in Nature Communications, where Kono and his team describe an unusual coupling between two magnons dominated by antiresonance, through which both magnons gain or lose energy simultaneously.
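    The avoided crossing behind magnon-magnon coupling can be sketched with a textbook two-mode model. This rotating-wave 2x2 sketch deliberately omits the counter-rotating (antiresonant) terms that dominate the coupling reported in the paper, and the frequencies and coupling strength are arbitrary illustrative numbers.

```python
import numpy as np

def hybrid_frequencies(w1, w2, g):
    """Eigenfrequencies of two coupled modes in a rotating-wave 2x2 model.
    On resonance (w1 == w2) the two hybrid modes split by exactly 2*g."""
    H = np.array([[w1, g],
                  [g, w2]])
    return np.sort(np.linalg.eigvalsh(H))

# Sweep one magnon frequency through the other, as a magnetic field would tune it
for w1 in (0.8, 1.0, 1.2):
    lo, hi = hybrid_frequencies(w1, w2=1.0, g=0.1)
    print(f"w1 = {w1:.1f}: hybrid modes at {lo:.3f} and {hi:.3f}")
```

The minimum splitting of 2g at resonance is the standard signature of strong coupling; "ultrastrong" coupling, as in the orthoferrite work, means g becomes a sizable fraction of the mode frequencies themselves, at which point the antiresonant terms left out of this sketch can no longer be neglected.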

  • Young teens should only use recreational internet and video games one hour daily

    Middle school-aged children who use the internet, social media or video games recreationally for more than an hour each day during the school week have significantly lower grades and test scores, according to a study from the Center for Gambling Studies at Rutgers University-New Brunswick.
    The findings appear in the journal Computers in Human Behavior.
    Researchers say the findings give parents and children a moderate threshold for using entertainment-related technology — no more than one hour daily on school days and four hours a day on weekends.
    “Interactive technology is widely used to promote children’s educational access and achievement,” said lead author Vivien (Wen Li) Anthony, an assistant professor at the School of Social Work and research associate at the Rutgers Center for Gambling Studies. “During the COVID-19 pandemic, technology has been essential to facilitating remote learning. At the same time, there is a growing concern that excessive technology use, particularly for entertainment, may adversely affect children’s educational development by facilitating undesirable study habits and detracting from time spent on learning activities.”
    The researchers, who include Professor Lia Nower of the Rutgers Center for Gambling Studies and a researcher from Renmin University of China, analyzed the China Education Panel Survey data, a national survey of educational needs and outcomes of children in China. Approximately 10,000 first-year middle school students were surveyed and followed. Their average age was 13.5 years.
    The results showed that children who used the internet, social media or video games for entertainment four or more hours daily were four times more likely to skip school than those who did not. Boys used interactive technology for entertainment significantly more than girls. Boys also performed worse and showed lower school engagement levels than girls.
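    To illustrate how a "four times more likely" figure is typically computed in survey research of this kind, the sketch below derives an odds ratio from a 2x2 contingency table. The counts are invented for the example and are not the study's data, and the study's exact effect measure is not specified in this summary.

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 contingency table:
       a = exposed with outcome,   b = exposed without outcome,
       c = unexposed with outcome, d = unexposed without outcome."""
    return (a / b) / (c / d)

# Hypothetical counts: heavy users (4+ hours/day) vs. others, skipping school
heavy_skip, heavy_no = 80, 420
light_skip, light_no = 40, 960
print(round(odds_ratio(heavy_skip, heavy_no, light_skip, light_no), 2))
```

With these made-up counts the heavy-use group's odds of skipping school come out roughly 4.6 times the light-use group's; real analyses would also adjust for confounders such as family background.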
    “Such findings are critical, particularly in light of the recent movement toward online learning in countries throughout the world,” said Anthony. “In a learning environment that integrates the internet, it is easy for children to move across educational and entertainment platforms during learning without alerting teachers or adults to alternate activities.”
    Anthony said children in the study who used technology in moderation (i.e., less than one hour per day on weekends) experienced less boredom at school, potentially due to the positive effects of participation in social media, video games and video streaming such as peer bonding and relationship building. Using interactive technology for entertainment in moderation advanced children’s cognitive development.
    The findings suggest that parents should place time limits on their children’s interactive technology use, and that parents and teachers should help children develop effective time management and self-regulation skills to reduce their reliance on technology.
    Story Source:
    Materials provided by Rutgers University. Note: Content may be edited for style and length.

  • Pristine quantum criticality found

    U.S. and Austrian physicists searching for evidence of quantum criticality in topological materials have found one of the most pristine examples yet observed.
    In an open access paper published online in Science Advances, researchers from Rice University, Johns Hopkins University, the Vienna University of Technology (TU Wien) and the National Institute of Standards and Technology (NIST) present the first experimental evidence to suggest that quantum criticality — a disordered state in which electrons waver between competing states of order — may give rise to topological phases, “protected” quantum states that are of growing interest for quantum computation.
    “The thought that underlies this work is, ‘Why not quantum criticality?’” said study co-author Qimiao Si, a theoretical physicist from Rice who’s spent two decades studying the interplay between quantum criticality and one of the most mysterious processes in modern physics, high-temperature superconductivity.
    “Maybe quantum criticality is not the only mechanism that can nucleate topological phases of matter, but we know quantum criticality provides a setting in which things are fluctuating and from which new states of matter can emerge,” said Si, director of the Rice Center for Quantum Materials (RCQM).
    In the study, Si and colleagues, including experimentalist Silke Bühler-Paschen, a longtime collaborator at TU Wien, and Collin Broholm of both NIST and Johns Hopkins, studied a semimetal made from one part cerium, four parts ruthenium and six parts tin. Topological phases have not been observed in CeRu4Sn6, but it is similar to a number of other materials in which they have been observed. And it is known to host the Kondo effect, a strong interaction between the magnetic moments of electrons attached to atoms in a metal and the spins of passing conduction electrons.
    In typical metals and semiconductors, interactions between electrons are weak enough that engineers and physicists need not take them into account when designing a computer chip or other electronic device. Not so in “strongly correlated” materials, like Kondo semimetals. In these, the overall behavior of the material — and of any device built from it — relies on electron-electron interactions. And these are the interactions that give rise to quantum criticality.

  • Accurate evaluation of CRISPR genome editing

    CRISPR technology allows researchers to edit genomes by altering DNA sequences and by thus modifying gene function. Its many potential applications include correcting genetic defects, treating and preventing the spread of diseases and improving crops.
    Genome editing tools, such as the CRISPR-Cas9 technology, can be engineered to make extremely well-defined alterations to the intended target on a chromosome where a particular gene or functional element is located. However, one potential complication is that CRISPR editing may lead to other, unintended, genomic changes, known as off-target activity. When several different sites in the genome are targeted, off-target activity can lead to translocations (unusual rearrangements of chromosomes) as well as to other unintended genomic modifications.
    Controlling off-target editing activity is one of the central challenges in making CRISPR-Cas9 technology accurate and applicable in medical practice. Current measurement assays and data analysis methods for quantifying off-target activity do not provide statistical evaluation, are not sufficiently sensitive in separating signal from noise in experiments with low editing rates, and require cumbersome efforts to address the detection of translocations.
    A multidisciplinary team of researchers from the Interdisciplinary Center Herzliya and Bar-Ilan University report in the May 24th issue of the journal Nature Communications the development of a new software tool to detect, evaluate and quantify off-target editing activity, including adverse translocation events that can cause cancer. The software is based on input taken from a standard measurement assay, involving multiplexed PCR amplification and Next-Generation Sequencing (NGS).
    Known as CRISPECTOR, the tool analyzes next generation sequencing data obtained from CRISPR-Cas9 experiments, and applies statistical modeling to determine and quantify editing activity. CRISPECTOR accurately measures off-target activity at every interrogated locus. It further achieves lower false-negative rates in sites with weak, yet significant, off-target activity. Importantly, one of the novel features of CRISPECTOR is its ability to detect adverse translocation events occurring in an editing experiment.
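    As a generic illustration of the underlying statistical question, not CRISPECTOR's actual model, a one-sided binomial test asks how likely it would be to see the observed number of edited reads at a candidate off-target site if only background sequencing noise were at work. The read counts and noise rate below are hypothetical.

```python
from math import comb

def binomial_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing k or more
    edited reads among n total reads if edits arise only from background
    noise at rate p. Computed as 1 - P(X <= k-1)."""
    return 1.0 - sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k))

# Hypothetical example: 12 edited reads out of 10,000 at a candidate
# off-target site, against a 0.05% background noise rate
p_value = binomial_tail(12, 10_000, 0.0005)
print(f"p = {p_value:.4g}")
```

A small p-value here suggests editing activity above the noise floor even at a very low absolute rate, which is precisely the low-signal regime the article says existing assays handle poorly; CRISPECTOR's real analysis additionally models mock controls and translocation evidence.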
    “In genome editing, especially for clinical applications, it is critical to identify low-level off-target activity and adverse translocation events. Even a small number of cells with carcinogenic potential, when transplanted into a patient in the context of gene therapy, can have detrimental consequences in terms of cancer pathogenesis. As part of treatment protocols, it is therefore important to detect these potential events in advance,” says Dr. Ayal Hendel, of Bar-Ilan University’s Mina and Everard Goodman Faculty of Life Sciences. Dr. Hendel led the study together with Prof. Zohar Yakhini, of the Arazi School of Computer Science at Interdisciplinary Center (IDC) Herzliya. “CRISPECTOR provides an effective method to characterize and quantify potential CRISPR-induced errors, thereby significantly improving the safety of future clinical use of genome editing.”
    Hendel’s team utilized CRISPR-Cas9 technology to edit genes in stem cells relevant to disorders of the blood and the immune system. In the process of analyzing the data, they became aware of the shortcomings of the existing tools for quantifying off-target activity and of gaps that should be bridged to improve applicability. This experience led to the collaboration with Prof. Yakhini’s leading computational biology and bioinformatics group.
    Prof. Zohar Yakhini, of IDC Herzliya and the Technion, adds: “In experiments utilizing deep sequencing techniques that have significant levels of background noise, low levels of true off-target activity can get lost under the noise. The need for a measurement approach and related data analysis that are capable of seeing beyond the noise, as well as of detecting adverse translocation events occurring in an editing experiment, is evident to genome editing scientists and practitioners. CRISPECTOR is a tool that can sift through the background noise to identify and quantify true off-target signal. Moreover, using statistical modeling and careful analysis of the data, CRISPECTOR can also identify a wider spectrum of genomic aberrations. By characterizing and quantifying potential CRISPR-induced errors, our methods will support the safer clinical use of genome editing therapeutic approaches.”
    The Hendel Lab and the Yakhini Research Group plan to apply the tool towards the study of potential therapies for genetic disorders of the immune system and of immunotherapy approaches in cancer.
    The study is a collaboration between the Hendel Lab at Bar-Ilan University (BIU) and the Yakhini Research Group (IDC Herzliya and the Technion). The project was led by Ido Amit (IDC) and Ortal Iancu (BIU). Also participating in this research were Daniel Allen, Dor Breier and Nimrod Ben Haim (BIU); Alona Levy-Jurgenson (Technion); Leon Anavy (Technion and IDC); Gavin Kurgan, Matthew S. McNeil, Garrett R. Rettig and Yu Wang (Integrated DNA Technologies, Inc. (IDT, US)). Additional contributors included Chihong Choi (IDC) and Mark Behlke (IDT, US).
    This study was supported by a grant from the European Research Council (ERC) under the Horizon 2020 research and innovation program, and the Adams Fellowships Program of the Israel Academy of Sciences and Humanities.
    Story Source:
    Materials provided by Bar-Ilan University.