More stories

  • Infinite chains of hydrogen atoms have surprising properties, including a metallic phase

    An infinite chain of hydrogen atoms is just about the simplest bulk material imaginable — a never-ending single-file line of protons surrounded by electrons. Yet a new computational study combining four cutting-edge methods finds that the modest material boasts fantastic and surprising quantum properties.
    By computing the consequences of changing the spacing between the atoms, an international team of researchers from the Flatiron Institute and the Simons Collaboration on the Many Electron Problem found that the hydrogen chain’s properties can be varied in unexpected and drastic ways. That includes the chain transforming from a magnetic insulator into a metal, the researchers report September 14 in Physical Review X.
    The computational methods used in the study represent a significant step toward custom-designing materials with sought-after properties, such as the possibility of high-temperature superconductivity, in which electrons flow freely through a material without losing energy, says the study’s senior author, Shiwei Zhang. Zhang is a senior research scientist at the Center for Computational Quantum Physics (CCQ) at the Simons Foundation’s Flatiron Institute in New York City.
    “The main purpose was to apply our tools to a realistic situation,” Zhang says. “Almost as a side product, we discovered all of this interesting physics of the hydrogen chain. We didn’t think that it would be as rich as it turned out to be.”
    Zhang, who is also a Chancellor Professor of Physics at the College of William and Mary, co-led the research with Mario Motta of IBM Quantum. Motta serves as first author of the paper alongside Claudio Genovese of the International School for Advanced Studies (SISSA) in Italy, Fengjie Ma of Beijing Normal University, Zhi-Hao Cui of the California Institute of Technology, and Randy Sawaya of the University of California, Irvine. Additional co-authors include CCQ co-director Andrew Millis, CCQ Flatiron Research Fellow Hao Shi and CCQ research scientist Miles Stoudenmire.
    The paper’s long author list — 17 co-authors in total — is uncommon for the field, Zhang says. Methods are often developed within individual research groups. The new study brings many methods and research groups together to combine forces and tackle a particularly thorny problem. “The next step in the field is to move toward more realistic problems,” says Zhang, “and there is no shortage of these problems that require collaboration.”
    While conventional methods can explain the properties of some materials, other materials, such as infinite hydrogen chains, pose a more daunting computational hurdle. That’s because the behavior of the electrons in those materials is heavily influenced by interactions between electrons. As electrons interact, they become quantum-mechanically entangled with one another. Once entangled, the electrons can no longer be treated individually, even when they are physically separate.

    The sheer number of electrons in a bulk material — roughly 100 billion trillion per gram — means that conventional brute force methods can’t even come close to providing a solution. The number of electrons is so large that it’s practically infinite when thinking at the quantum scale.
    Thankfully, quantum physicists have developed clever methods of tackling this many-electron problem. The new study combines four such methods: variational Monte Carlo, lattice-regularized diffusion Monte Carlo, auxiliary-field quantum Monte Carlo, and standard and sliced-basis density-matrix renormalization group. Each of these cutting-edge methods has its strengths and weaknesses. Using them in parallel and in concert provides a fuller picture, Zhang says.
    Researchers, including authors of the new study, previously used those methods in 2017 to compute the amount of energy each atom in a hydrogen chain has as a function of the chain’s spacing. This computation, known as the equation of state, doesn’t by itself provide a complete picture of the chain’s properties. By further honing their methods, the researchers have now filled in that picture.
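    In plain terms, the equation of state here is the ground-state energy per atom at a given spacing. A schematic statement of that quantity (standard usage; the notation is ours, not taken from the paper):

    ```latex
    % Equation of state of the chain: the ground-state energy E_0 of an
    % N-atom chain at uniform interatomic spacing R, per atom, in the
    % limit of an infinitely long chain.
    E(R) = \lim_{N \to \infty} \frac{E_0(N, R)}{N}
    ```

    Sweeping the spacing R and tracking E(R) together with correlation functions is what exposes the insulating, dimer-leaning and metallic regimes described below.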
    At large separations, the researchers found that the electrons remain confined to their respective protons. Even at such large distances, the electrons still ‘know’ about each other and become entangled. Because the electrons can’t hop from atom to atom as easily, the chain acts as an electrical insulator.
    As the atoms move closer together, the electrons try to form molecules of two hydrogen atoms each. Because the protons are fixed in place, these molecules can’t form. Instead, the electrons ‘wave’ to one another, as Zhang puts it. Electrons will lean toward an adjacent atom. In this phase, if you find an electron leaning toward one of its neighbors, you’ll find that neighboring electron responding in return. This pattern of pairs of electrons leaning toward each other will continue in both directions.

    Moving the hydrogen atoms even closer together, the researchers discovered that the hydrogen chain transformed from an insulator into a metal with electrons moving freely between atoms. Under a simple model of interacting particles known as the one-dimensional Hubbard model, this transition shouldn’t happen, as electrons should electrically repel each other enough to restrict movement. In the 1960s, British physicist Nevill Mott predicted the existence of an insulator-to-metal transition based on a mechanism involving so-called excitons, each consisting of an electron trying to break free of its atom and the hole it leaves behind. Mott proposed an abrupt transition driven by the breakup of these excitons — something the new hydrogen chain study didn’t see.
    Instead, the researchers discovered a more nuanced insulator-to-metal transition. As the atoms move closer together, electrons gradually get peeled off the tightly bound inner core around the proton line and become a thin ‘vapor’ only loosely bound to the line and displaying interesting magnetic structures.
    The infinite hydrogen chain will be a key benchmark in the future in the development of computational methods, Zhang says. Scientists can model the chain using their methods and check their results for accuracy and efficiency against the new study.
    The new work is a leap forward in the quest to utilize computational methods to model realistic materials, the researchers say. In the 1960s, British physicist Neil Ashcroft proposed that metallic hydrogen, for instance, might be a high-temperature superconductor. While the one-dimensional hydrogen chain doesn’t exist in nature (it would crumple into a three-dimensional structure), the researchers say that the lessons they learned are a crucial step forward in the development of the methods and physical understanding needed to tackle even more realistic materials.

  • Light processing improves robotic sensing, study finds

    A team of Army researchers uncovered how the human brain processes bright and contrasting light, which they say is a key to improving robotic sensing and enabling autonomous agents to team with humans.
    To enable developments in autonomy, a top Army priority, machine sensing must be resilient across changing environments, researchers said.
    “When we develop machine vision algorithms, real-world images are usually compressed to a narrower range, as a cellphone camera does, in a process called tone mapping,” said Andre Harrison, a researcher at the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory. “This can contribute to the brittleness of machine vision algorithms because they are based on artificial images that don’t quite match the patterns we see in the real world.”
    By developing a new system with a 100,000-to-1 display capability, the team was able to study the brain’s computations under more realistic conditions and begin building that biological resilience into sensors, Harrison said.
    Current vision algorithms are based on human and animal studies with computer monitors, which have a limited range in luminance of about 100-to-1, the ratio between the brightest and darkest pixels. In the real world, that variation could be a ratio of 100,000-to-1, a condition called high dynamic range, or HDR.
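    To make those numbers concrete, here is a minimal sketch, ours rather than the Army team’s algorithm, of a log-domain tone-mapping operator that squeezes a synthetic 100,000-to-1 scene into a roughly 100-to-1 display range (the array sizes and display ratio are illustrative assumptions):

    ```python
    import numpy as np

    def log_tone_map(luminance, display_ratio=100.0):
        """Toy log-domain tone mapping (illustrative, not the study's method).

        Scene luminance spanning ~100,000:1 is compressed into a display
        range that can only reproduce ~100:1.
        """
        lum = np.asarray(luminance, dtype=float)
        log_lum = np.log10(lum / lum.min())      # 0 .. log10(scene ratio)
        compressed = log_lum / log_lum.max()     # normalize to the 0..1 range
        return display_ratio ** compressed       # spans 1 .. display_ratio

    scene = np.random.uniform(1.0, 1e5, size=(64, 64))  # synthetic HDR patch
    display = log_tone_map(scene)
    print(round(display.max() / display.min()))         # ~100, down from ~100,000
    ```

    Whatever operator is used, compression of this sort discards real-world luminance structure, which is exactly the brittleness the researchers point to.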
    “Changes and significant variations in light can challenge Army systems — drones flying under a forest canopy could be confused by reflectance changes when wind blows through the leaves, or autonomous vehicles driving on rough terrain might not recognize potholes or other obstacles because the lighting conditions are slightly different from those on which their vision algorithms were trained,” said Army researcher Dr. Chou Po Hung.

    The research team sought to understand how the brain automatically takes the 100,000-to-1 input from the real world and compresses it to a narrower range, which enables humans to interpret shape. The team studied early visual processing under HDR, examining how simple features like HDR luminance and edges interact, as a way to uncover the underlying brain mechanisms.
    “The brain has more than 30 visual areas, and we still have only a rudimentary understanding of how these areas process the eye’s image into an understanding of 3D shape,” Hung said. “Our results with HDR luminance studies, based on human behavior and scalp recordings, show just how little we truly know about how to bridge the gap from laboratory to real-world environments. But, these findings break us out of that box, showing that our previous assumptions from standard computer monitors have limited ability to generalize to the real world, and they reveal principles that can guide our modeling toward the correct mechanisms.”
    The Journal of Vision published the team’s research findings, “Abrupt darkening under high dynamic range (HDR) luminance invokes facilitation for high contrast targets and grouping by luminance similarity.”
    Researchers said the discovery of how light and contrast edges interact in the brain’s visual representation will help improve the effectiveness of algorithms for reconstructing the true 3D world under real-world luminance, by correcting for ambiguities that are unavoidable when estimating 3D shape from 2D information.
    “Through millions of years of evolution, our brains have evolved effective shortcuts for reconstructing 3D from 2D information,” Hung said. “It’s a decades-old problem that continues to challenge machine vision scientists, even with the recent advances in AI.”
    In addition to vision for autonomy, this discovery will also help in developing other AI-enabled devices, such as radar and remote speech understanding, that depend on sensing across wide dynamic ranges.
    With their results, the researchers are working with partners in academia to develop computational models, specifically with spiking neurons that may have advantages for both HDR computation and for more power-efficient vision processing — both important considerations for low-powered drones.
    “The issue of dynamic range is not just a sensing problem,” Hung said. “It may also be a more general problem in brain computation because individual neurons have tens of thousands of inputs. How do you build algorithms and architectures that can listen to the right inputs across different contexts? We hope that, by working on this problem at a sensory level, we can confirm that we are on the right track, so that we can have the right tools when we build more complex AIs.”

  • Fast and efficient method to produce red blood cells developed

    Researchers from Singapore-MIT Alliance for Research and Technology (SMART), MIT’s research enterprise in Singapore, have discovered a new way to manufacture human red blood cells (RBCs) that cuts the culture time by half compared to existing methods and uses novel sorting and purification methods that are faster, more precise and less costly.
    Blood transfusions save millions of lives every year, but over half the world’s countries do not have sufficient blood supply to meet their needs. The ability to manufacture RBCs on demand, especially the universal donor blood (type O negative), would significantly benefit those in need of transfusion for conditions like leukemia by circumventing the need for large-volume blood draws and difficult cell isolation processes.
    Easier and faster manufacturing of RBCs would also have a significant impact on blood banks worldwide and reduce dependence on donor blood, which carries a higher risk of infection. It is also critical for research into diseases such as malaria, which affects over 220 million people annually, and could even enable new and improved cell therapies.
    However, manufacturing RBCs is time-consuming and creates undesirable by-products, with current purification methods being costly and not optimal for large scale therapeutic applications. SMART’s researchers have thus designed an optimised intermediary cryogenic storage protocol that reduces the cell culture time to 11 days post-thaw, eliminating the need for continuous 23-day blood manufacturing. This is aided by complementary technologies the team developed for highly efficient, low-cost RBC purification and more targeted sorting.
    In a paper titled “Microfluidic label-free bioprocessing of human reticulocytes from erythroid culture” recently published in the journal Lab on a Chip, the researchers explain the huge technical advancements they have made towards improving RBC manufacturing. The study was carried out by researchers from two of SMART’s Interdisciplinary Research Groups (IRGs) — Antimicrobial Resistance (AMR) and Critical Analytics for Manufacturing Personalised-Medicine (CAMP) — co-led by Principal Investigators Jongyoon Han, a Professor at MIT, and Peter Preiser, a Professor at NTU. The team also included AMR and CAMP IRG faculty appointed at the National University of Singapore (NUS) and Nanyang Technological University (NTU).
    “Traditional methods for producing human RBCs usually require 23 days for the cells to grow, expand exponentially and finally mature into RBCs,” says Dr Kerwin Kwek, lead author of the paper and Senior Postdoctoral Associate at SMART CAMP. “Our optimised protocol stores the cultured cells in liquid nitrogen on what would normally be Day 12 in the typical process, and upon demand thaws the cells and produces the RBCs within 11 days.”
    The researchers also developed novel purification and sorting methods by modifying existing Dean Flow Fractionation (DFF) and Deterministic Lateral Displacement (DLD) techniques: a microfluidic chip with a trapezoidal cross-section for DFF sorting, and a unique sorting system built around an inverse L-shaped pillar structure for DLD sorting.
    SMART’s new sorting and purification techniques using the modified DFF and DLD methods leverage the RBCs’ size and deformability for purification, rather than treating the cells as rigid spheres. Because most human cells are deformable, the technique can have wide biological and clinical applications, such as cancer cell and immune cell sorting and diagnostics.
    When tested, the purified RBCs were found to retain their cellular functionality, as demonstrated by high malaria parasite infectivity; the parasite requires highly pure and healthy cells for infection. This confirms that SMART’s new RBC sorting and purifying technologies are well suited to investigating malaria pathology.
    Compared with conventional cell purification by fluorescence-activated cell sorting (FACS), SMART’s enhanced DFF and DLD methods offer comparable purity while processing at least twice as many cells per second at less than a third of the cost. In scaled-up manufacturing, DFF is better suited because of its high volumetric throughput, whereas when cell purity is pivotal, DLD’s high precision is most advantageous.
    “Our novel sorting and purification methods result in significantly faster cell processing time and can be easily integrated into current cell manufacturing processes. The process also does not require a trained technician to perform sample handling procedures and is scalable for industrial production,” Dr Kwek continues.
    The results of their research give scientists faster access to final cell products that are fully functional, with high purity, at a reduced cost of production.

  • Pandemic spawns 'infodemic' in scientific literature

    The science community has responded to the COVID-19 pandemic with such a flurry of research studies that it is hard for anyone to digest them all, underscoring a long-standing need to make scientific publication more accessible, transparent and accountable, two artificial intelligence experts assert in a data science journal.
    The rush to publish results has resulted in missteps, say Ganesh Mani, an investor, technology entrepreneur and adjunct faculty member in Carnegie Mellon University’s Institute for Software Research, and Tom Hope, a post-doctoral researcher at the Allen Institute for AI. In an opinion article in today’s issue of the journal Patterns, they argue that new policies and technologies are needed to ensure relevant, reliable information is properly recognized.
    Potential solutions include combining human expertise with AI to keep pace with a knowledge base that is expanding geometrically. AI might be used to summarize and collect research on a topic, for instance, while humans curate the findings, as sketched below.
    “Given the ever-increasing research volume, it will be hard for humans alone to keep pace,” they write.
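    As a minimal sketch of that division of labor (our illustration, not a system proposed in the Patterns article; the model checkpoint and abstract text are placeholder assumptions), an off-the-shelf summarizer can draft digests that a human curator then vets:

    ```python
    # Sketch only: the machine drafts, a human curates. The model checkpoint
    # is an illustrative choice; `abstracts` would come from a preprint server.
    from transformers import pipeline

    summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

    abstracts = [
        "We retrospectively analyzed 312 hospitalized patients to assess "
        "whether early anticoagulation was associated with reduced mortality. "
        "After adjustment for confounders, we observed a modest association "
        "that did not reach statistical significance.",
    ]

    for text in abstracts:
        draft = summarizer(text, max_length=40, min_length=10)[0]["summary_text"]
        print(draft)  # a human expert reviews every machine-written draft
    ```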
    In the case of COVID-19 and other new diseases, “you have a tendency to rush things because the clinicians are asking for guidance in treating their patients,” Mani said. Scientists certainly have responded — by mid-August, more than 8,000 preprints of scientific papers related to the novel coronavirus had been posted in online medical, biology and chemistry archives. Even more papers had been posted on such topics as quarantine-induced depression and the impact on climate change from decreased transportation emissions.
    At the same time, the average time to perform peer review and publish new articles has shrunk; in the case of virology, the average dropped from 117 to 60 days.

    This surge of information is what the World Health Organization calls an “infodemic” — an overabundance of information, ranging from accurate to demonstrably false. Not surprisingly, problems such as the hydroxychloroquine controversy have erupted as research has been rushed to publication and subsequently withdrawn.
    “We’re going to have that same conversation with vaccines,” Mani predicted. “We’re going to have a lot of debates.”
    Problems in scientific publication are nothing new, he said. As a grad student 30 years ago, he proposed an electronic archive for scientific literature that would better organize research and make it easier to find relevant information. Many ideas continue to circulate about how to improve scientific review and publication, but COVID-19 has exacerbated the situation.
    Some of the speed bumps and guard rails that Mani and Hope propose are new policies. For instance, scientists usually emphasize experiments and therapies that work; highlighting negative results, on the other hand, is important for clinicians and discourages other scientists from going down the same blind alleys. Identifying the best reviewers, sharing review comments and linking papers to related papers, retraction sites or legal rulings are among other ideas they explore.
    Greater use of AI to digest and consolidate research is a major focus. Previous attempts to use AI to do so have failed in part because of the often figurative and sometimes ambiguous language used by humans, Mani noted. It may be necessary to write two versions of research papers — one written in a way that draws the attention of people and another written in a boring, uniform style that is more understandable to machines.
    Mani said he and Hope have no illusions that their paper will settle the debate about improving scientific literature, but hope that it will spur changes in time for the next global crisis.
    “Putting such infrastructure in place will help society with the next strategic surprise or grand challenge, which is likely to be equally, if not more, knowledge intensive,” they concluded.

    Story Source:
    Materials provided by Carnegie Mellon University. Original written by Byron Spice.

  • Netflix: A zebra among horses

    Netflix is often criticized as a Hollywood-style entertainment behemoth crushing all competition and diminishing local content, but media studies expert Professor Amanda Lotz, from QUT’s Digital Media Research Centre, says that’s a simplistic view of the world’s biggest internet-distributed video service, which has proved a game-changer for entertainment.
    “Netflix must be examined as a zebra among horses,” said Professor Lotz, who is in the middle of a three-year Australian Research Council Discovery Project, Internet-distributed television: Cultural, industrial and policy dynamics. She recently published an article in the International Journal of Cultural Studies, ‘In Between the Global and the Local: Mapping the Geographies of Netflix as a Multinational Service.’
    “Few recognize the extent to which Netflix has metamorphosed into a global television service. Unlike services that distribute only US-produced content, Netflix has funded the development of a growing library of series produced in more than 27 countries, across six continents, including Australia.
    “Netflix has regional offices now in Singapore, Amsterdam, and São Paulo. Last year it opened its Australian headquarters in Sydney.”
    Along with QUT’s Distinguished Professor Stuart Cunningham and Dr Ramon Lobato, Senior Research Fellow, RMIT, Professor Lotz is investigating the impact of global subscription video-on-demand platforms on national television markets.
    “Internet-distributed video services such as Netflix, have completely transformed the entertainment landscape and the competitive field in which free-to-air television operates, as well as turned the definition of ‘pay TV’ on its head,” Professor Lotz said.

    “But the Netflix model has been the real gamechanger. Previously, the core business of channels like the BBC, ABC or NBC that commission and pay the lion’s share of production fees for series has been nation bound, even if those shows would someday be available to audiences in many countries.
    “Netflix’s propensity to commission series in multiple countries, and then make them available to the full 150-some million subscribers simultaneously, is unprecedented and something no television channel could do.
    “A local example of this is Hannah Gadsby: Nanette, which has given the Australian comedian a new global profile. She now has a second Netflix show, Hannah Gadsby: Douglas.
    “And although many believe Netflix competes with the likes of Amazon Prime Video, Apple TV+, Stan and Disney+, none of these services show evidence of supporting multinational production at a scale comparable to Netflix.
    “Our research project has compiled a database of series commissioned by Netflix (in whole or part) and their country of origin. We have found more than half of the titles are produced outside the US and initial analysis of Netflix original films suggests a similar pattern.”
    However, Professor Lotz said Netflix could never develop the depth of content necessary to replace national providers, especially public service broadcasters central to cultural storytelling.

    “It is difficult to appreciate whether some of Netflix’s peculiarity results from its global reach, business model, or distribution technology, but these are crucial questions to ask. And do these characteristics lead to the availability of stories, characters, and places not readily available? If so, this is a notable benefit to audiences,” she said.
    “We should also ask how these characteristics affect opportunities available for writers, producers, and actors who might be rethinking the kind of stories that must be told to sell internationally.
    “Appealing to audiences outside a commissioning channel’s country is increasingly necessary. Even if Netflix is unlikely to eliminate national providers, it is reconfiguring the competitive landscape.”
    In recent months, Professor Lotz has also posted a blog series, Netflix 30 Q&A, which examines how the subscription video-on-demand (SVOD) business differs from linear, ad-supported channels and how it allows Netflix different programming strategies.
    “The long term and global rights the company seeks in its commissions have required significant changes in the remuneration norms for those who make its series, and it remains unclear whether the new norms amount to lower pay,” she said.
    “National broadcasters worry about keeping up with the escalating fees Netflix can support for its prestige series and complain of an unfair playing field, where Netflix isn’t subject to the same local content rules and other requirements.
    “But business and cultural analysts must stop trying to shoehorn Netflix into the same category as linear channels and streaming services aimed at pushing US content abroad. Over its 23-year existence, Netflix has evolved repeatedly. Perhaps this steady change fuels the misperception.”

  • New machine learning-assisted method rapidly classifies quantum sources

    For quantum optical technologies to become more practical, there is a need for large-scale integration of quantum photonic circuits on chips.
    This integration calls for scaling up key building blocks of these circuits — sources of particles of light — produced by single quantum optical emitters.
    Purdue University engineers created a new machine learning-assisted method that could make quantum photonic circuit development more efficient by rapidly preselecting these solid-state quantum emitters.
    The work is published in the journal Advanced Quantum Technologies.
    Researchers around the world have been exploring different ways to fabricate identical quantum sources by “transplanting” nanostructures containing single quantum optical emitters into conventional photonic chips.
    “With the growing interest in scalable realization and rapid prototyping of quantum devices that utilize large emitter arrays, high-speed, robust preselection of suitable emitters becomes necessary,” said Alexandra Boltasseva, Purdue’s Ron and Dotty Garvin Tonjes Professor of Electrical and Computer Engineering.

    Quantum emitters produce light with unique, non-classical properties that can be used in many quantum information protocols.
    The challenge is that interfacing most solid-state quantum emitters with existing scalable photonic platforms requires complex integration techniques. Before integrating, engineers need to first identify bright emitters that produce single photons rapidly, on-demand and with a specific optical frequency.
    Emitter preselection based on “single-photon purity” — which is the ability to produce only one photon at a time — typically takes several minutes for each emitter. Thousands of emitters may need to be analyzed before finding a high-quality candidate suitable for quantum chip integration.
    To speed up screening based on single-photon purity, Purdue researchers trained a machine to recognize promising patterns in single-photon emission within a split second.
    According to the researchers, rapidly finding the purest single-photon emitters within a set of thousands would be a key step toward practical and scalable assembly of large quantum photonic circuits.
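    The flavor of the task can be captured in a toy sketch (ours, not the Purdue pipeline): by the standard criterion used in the field, an emitter whose second-order correlation at zero delay satisfies g2(0) < 0.5 counts as a single-photon source (our gloss; the article speaks only of “single-photon purity”), and a classifier learns to predict that label from a short, noisy coincidence histogram standing in for one second of data. All sizes and noise levels below are illustrative assumptions:

    ```python
    # Toy purity classification, not the published code: simulate sparse
    # photon-coincidence histograms, label emitters by the standard
    # criterion g2(0) < 0.5, and train a classifier on the noisy data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_emitters, n_bins = 2000, 61            # 61 time-delay bins around zero

    g2_zero = rng.uniform(0.0, 1.0, n_emitters)   # depth of the antibunching dip
    ideal = np.ones((n_emitters, n_bins))
    ideal[:, n_bins // 2] = g2_zero               # dip sits at zero delay
    counts = rng.poisson(ideal * 30)              # sparse "one second" of counts

    X = counts / counts.mean(axis=1, keepdims=True)   # crude normalization
    y = (g2_zero < 0.5).astype(int)                   # 1 = sufficiently pure

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
    ```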

    “Given a photon purity standard that emitters must meet, we have taught a machine to classify single-photon emitters as sufficiently or insufficiently ‘pure’ with 95% accuracy, based on minimal data acquired within only one second,” said Zhaxylyk Kudyshev, a Purdue postdoctoral researcher.
    The researchers found that the conventional photon purity measurement method used for the same task took 100 times longer to reach the same level of accuracy.
    “The machine learning approach is such a versatile and efficient technique because it is capable of extracting the information from the dataset that the fitting procedure usually ignores,” Boltasseva said.
    The researchers believe that their approach has the potential to dramatically advance most quantum optical measurements that can be formulated as binary or multiclass classification problems.
    “Our technique could, for example, speed up super-resolution microscopy methods built on higher-order correlation measurements that are currently limited by long image acquisition times,” Kudyshev said.

    Story Source:
    Materials provided by Purdue University.

  • Quirky response to magnetism presents quantum physics mystery

    The search is on to discover new states of matter, and possibly new ways of encoding, manipulating, and transporting information. One goal is to harness materials’ quantum properties for communications that go beyond what’s possible with conventional electronics. Topological insulators — materials that act mostly as insulators but carry electric current across their surface — provide some tantalizing possibilities.
    “Exploring the complexity of topological materials — along with other intriguing emergent phenomena such as magnetism and superconductivity — is one of the most exciting and challenging areas of focus for the materials science community at the U.S. Department of Energy’s Brookhaven National Laboratory,” said Peter Johnson, a senior physicist in the Condensed Matter Physics & Materials Science Division at Brookhaven. “We’re trying to understand these topological insulators because they have lots of potential applications, particularly in quantum information science, an important new area for the division.”
    For example, materials with this split insulator/conductor personality exhibit a separation in the energy signatures of their surface electrons with opposite “spin.” This quantum property could potentially be harnessed in “spintronic” devices for encoding and transporting information. Going one step further, coupling these electrons with magnetism can lead to novel and exciting phenomena.
    “When you have magnetism near the surface you can have these other exotic states of matter that arise from the coupling of the topological insulator with the magnetism,” said Dan Nevola, a postdoctoral fellow working with Johnson. “If we can find topological insulators with their own intrinsic magnetism, we should be able to efficiently transport electrons of a particular spin in a particular direction.”
    In a new study just published and highlighted as an Editor’s Suggestion in Physical Review Letters, Nevola, Johnson, and their coauthors describe the quirky behavior of one such magnetic topological insulator. The paper includes experimental evidence that intrinsic magnetism in the bulk of manganese bismuth telluride (MnBi2Te4) also extends to the electrons on its electrically conductive surface. Previous studies had been inconclusive as to whether or not the surface magnetism existed.
    But when the physicists measured the surface electrons’ sensitivity to magnetism, only one of two observed electronic states behaved as expected. Another surface state, which was expected to have a larger response, acted as if the magnetism wasn’t there.

    “Is the magnetism different at the surface? Or is there something exotic that we just don’t understand?” Nevola said.
    Johnson leans toward the exotic physics explanation: “Dan did this very careful experiment, which enabled him to look at the activity in the surface region and identify two different electronic states on that surface, one that might exist on any metallic surface and one that reflected the topological properties of the material,” he said. “The former was sensitive to the magnetism, which proves that the magnetism does indeed exist in the surface. However, the other one that we expected to be more sensitive had no sensitivity at all. So, there must be some exotic physics going on!”
    The measurements
    The scientists studied the material using various types of photoemission spectroscopy, where light from an ultraviolet laser pulse knocks electrons loose from the surface of the material and into a detector for measurement.
    “For one of our experiments, we use an additional infrared laser pulse to give the sample a little kick to move some of the electrons around prior to doing the measurement,” Nevola explained. “It takes some of the electrons and kicks them [up in energy] to become conducting electrons. Then, in very, very short timescales — picoseconds — you do the measurement to look at how the electronic states have changed in response.”
    The map of the energy levels of the excited electrons shows two distinct surface bands, each of which splits into separate branches with the electrons in each branch having opposite spin. Both bands, each representing one of the two electronic states, were expected to respond to the presence of magnetism.

    To test whether these surface electrons were indeed sensitive to magnetism, the scientists cooled the sample to 25 Kelvin, allowing its intrinsic magnetism to emerge. However, only in the non-topological electronic state did they observe a “gap” opening up in the anticipated part of the spectrum.
    “Within such gaps, electrons are prohibited from existing, and thus their disappearance from that part of the spectrum represents the signature of the gap,” Nevola said.
    The observation of a gap appearing in the regular surface state was definitive evidence of magnetic sensitivity — and evidence that the magnetism intrinsic in the bulk of this particular material extends to its surface electrons.
    However, the “topological” electronic state the scientists studied showed no such sensitivity to magnetism — no gap.
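    For context, the textbook expectation (not a result from the paper) is that out-of-plane surface magnetization opens a gap in the otherwise gapless Dirac dispersion of a topological surface state:

    ```latex
    % Expected surface-state dispersion once magnetism sets in: the
    % gap 2\Delta is proportional to the out-of-plane magnetization m.
    E_{\pm}(k) = \pm\sqrt{(\hbar v_F k)^2 + \Delta^2}, \qquad 2\Delta \propto m
    ```

    The puzzle is that the measured topological band stayed gapless even as the ordinary surface band gapped out, the opposite of what this simple picture predicts for the topological state.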
    “That throws in a bit of a question mark,” Johnson said.
    “These are properties we’d like to be able to understand and engineer, much like we engineer the properties of semiconductors for a variety of technologies,” Johnson continued.
    In spintronics, for example, the idea is to use different spin states to encode information in the way positive and negative electric charges are presently used in semiconductor devices to encode the “bits” — 1s and 0s — of computer code. But spin-coded quantum bits, or qubits, have many more possible states — not just two. This will greatly expand on the potential to encode information in new and powerful ways.
    “Everything about magnetic topological insulators looks like they’re right for this kind of technological application, but this particular material doesn’t quite obey the rules,” Johnson said.
    So now, as the team continues their search for new states of matter and further insights into the quantum world, there’s a new urgency to explain this particular material’s quirky quantum behavior.

  • New genetic analysis method could advance personal genomics

    Geneticists could identify the causes of disorders that currently go undiagnosed if standard practices for collecting individual genetic information were expanded to capture more variants that researchers can now decipher, concludes new Johns Hopkins University research.
    The laboratory of Johns Hopkins biomedical engineering professor Alexis Battle has developed a technique to begin identifying potentially problematic rare genetic variants that exist in the genomes of all people, an approach that would be even more powerful if additional genetic sequencing information were included in standard collection methods. The team’s findings are published in the latest issue of Science and are part of the Genotype-Tissue Expression (GTEx) Program funded by the National Institutes of Health.
    “The implications of this could be quite large. Everyone has around 50,000 variants that are rare in the population and we have absolutely no idea what most of them are doing,” Battle said. “If you collect gene expression data, which shows which proteins are being produced in a patient’s cells at what levels, we’re going to be able to identify what’s going on at a much higher rate.”
    While approximately 8% of U.S. citizens, mostly children, suffer from genetic disorders, the genetic cause has not been found in about half of those cases. More frustrating still, according to Battle, is that many more people are likely living with subtler, genetically influenced health ailments that have not been identified.
    “We really don’t know how many people are out there walking around with a genetic aberration that is causing them health issues,” she said. “They go completely undiagnosed, meaning we cannot find the genetic cause of their problems.”
    The field of personalized genomics is unable to characterize these rare variants because most genetic variants, specifically those in “non-coding” parts of the genome that do not specify a protein, are not tested. Doing so would represent a major advance in a growing field focused on the sequencing and analysis of individuals’ genomes, she said.
    The Battle Lab developed a computational system called “Watershed” that can scour reams of genetic data along with gene expression to predict the functions of variants from individuals’ genomes. They validated those predictions in the lab and applied the findings to assess the rare variants captured in massive gene collections such as the UK Biobank, the Million Veterans Program and the Jackson Heart Study. The results have helped to show which rare variants may be impacting human traits.
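    The core intuition can be sketched in a few lines (a simplification, not the Watershed model itself, which is a probabilistic model integrating genomic annotations with multi-tissue expression): flag genes whose expression in an individual is an extreme outlier relative to a reference cohort, then prioritize the rare variants near those genes. All numbers below are illustrative:

    ```python
    # Simplified outlier-plus-variant prioritization, not the Watershed
    # model itself: z-score a patient's expression against a reference
    # cohort and intersect extreme outlier genes with rare-variant calls.
    import numpy as np

    rng = np.random.default_rng(1)
    n_genes, n_ref = 5000, 500

    reference = rng.normal(size=(n_ref, n_genes))   # cohort expression matrix
    patient = rng.normal(size=n_genes)
    patient[42] = 6.0                               # one extreme outlier gene

    z = (patient - reference.mean(axis=0)) / reference.std(axis=0)

    has_rare_variant = rng.random(n_genes) < 0.01   # stand-in variant calls
    has_rare_variant[42] = True                     # the outlier also has one

    candidates = np.where((np.abs(z) > 4) & has_rare_variant)[0]
    print("genes to examine:", candidates)          # should include gene 42
    ```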
    “Any improvement we can make in this area has implications for public health,” Battle said. “Even pointing to what the genetic cause is gives parents and patients a huge sense of relief and understanding and can point to potential therapeutics.”
    Battle’s team worked in collaboration with researchers from Scripps Translational Science Institute, the New York Genome Center, the Massachusetts Institute of Technology and Stanford, Harvard and Columbia universities.
    “Looking at the cross-tissue transcriptional footprint of rare genetic variants across many human tissues in GTEx data also helps us better understand the gaps and the potential of these analyses for clinical diagnostics,” said Pejman Mohammadi, a co-author and professor of integrative structural and computational biology at Scripps Research.
    The research was supported by grants R01MH109905, 1R01HG010480 and R01HG008150, and by the Searle Scholars Program.

    Story Source:
    Materials provided by Johns Hopkins University.