More stories

  • Power of DNA to store information gets an upgrade

    A team of interdisciplinary researchers has discovered a new technique to store information in DNA — in this case “The Wizard of Oz,” translated into Esperanto — with unprecedented accuracy and efficiency. The technique harnesses the information-storage capacity of intertwined strands of DNA to encode and retrieve information in a way that is both durable and compact.
    The technique is described in a paper in this week’s Proceedings of the National Academy of Sciences.
    “The key breakthrough is an encoding algorithm that allows accurate retrieval of the information even when the DNA strands are partially damaged during storage,” said Ilya Finkelstein, an associate professor of molecular biosciences and one of the authors of the study.
    Humans are creating information at exponentially higher rates than we used to, contributing to the need for a way to store more information efficiently and in a way that will last a long time. Companies such as Google and Microsoft are among those exploring using DNA to store information.
    “We need a way to store this data so that it is available when and where it’s needed in a format that will be readable,” said Stephen Jones, a research scientist who collaborated on the project with Finkelstein; Bill Press, a professor jointly appointed in computer science and integrative biology; and Ph.D. alumnus John Hawkins. “This idea takes advantage of what biology has been doing for billions of years: storing lots of information in a very small space that lasts a long time. DNA doesn’t take up much space, it can be stored at room temperature, and it can last for hundreds of thousands of years.”
    DNA is about 5 million times more efficient than current storage methods. Put another way, a one milliliter droplet of DNA could store the same amount of information as two Walmarts full of data servers. And DNA doesn’t require permanent cooling and hard disks that are prone to mechanical failures.
    There’s just one problem: DNA is prone to errors. And errors in a genetic code behave very differently from errors in computer code. Errors in computer code tend to show up as blank spots in the code. Errors in DNA sequences show up as insertions or deletions. The problem there is that when something is deleted or added in DNA, the whole sequence shifts, with no blank spots to alert anyone.
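    The difference is easy to see in a toy sketch (a minimal Python illustration of the read-out problem, not the team’s actual encoding scheme): a substitution corrupts only the word it lands in, while a single deletion shifts every base that follows it.

    ```python
    # Toy illustration only: store a message as fixed-width 4-base "words"
    # and decode purely by position, the way a naive read-out would.
    reference = "AAACAAGGATTTACCC"          # four 4-base words

    def decode(strand, width=4):
        """Split a strand into fixed-width words (naive positional read-out)."""
        return [strand[i:i + width] for i in range(0, len(strand), width)]

    substituted = reference[:5] + "T" + reference[6:]   # one base replaced
    deleted = reference[:5] + reference[6:]             # one base lost

    print(decode(reference))     # ['AAAC', 'AAGG', 'ATTT', 'ACCC']
    print(decode(substituted))   # only the second word is wrong
    print(decode(deleted))       # every word after the error is misread
    ```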
    Previously, when information was stored in DNA, the piece of information that needed to be saved, such as a paragraph from a novel, would be repeated 10 to 15 times. When the information was read, the repetitions would be compared to eliminate any insertions or deletions.
    “We found a way to build the information more like a lattice,” Jones said. “Each piece of information reinforces other pieces of information. That way, it only needs to be read once.”
    The language the researchers developed also avoids sections of DNA that are prone to errors or that are difficult to read. The parameters of the language can also change with the type of information that is being stored. For instance, a dropped word in a novel is not as big a deal as a dropped zero in a tax return.
    To demonstrate information retrieval from degraded DNA, the team subjected its “Wizard of Oz” code to high temperatures and extreme humidity. Even though the DNA strands were damaged by these harsh conditions, all the information was still decoded successfully.
    “We tried to tackle as many problems with the process as we could at the same time,” said Hawkins, who was until recently at UT’s Oden Institute for Computational Engineering and Sciences. “What we ended up with is pretty remarkable.”

    Story Source:
    Materials provided by University of Texas at Austin. Note: Content may be edited for style and length.

  • Social media-inspired models show winter warming hits fish stocks

    Mathematical modelling inspired by social media is identifying the significant impacts of warming seas on the world’s fisheries.
    University of Queensland School of Veterinary Science researcher Dr Nicholas Clark and colleagues from the University of Otago and James Cook University have assembled a holistic picture of climate change’s impacts on fish stocks in the Mediterranean Sea.
    “Usually, when modelling ecosystems to understand how nature is changing, we build models that only focus on the effects of the environment,” Dr Clark said.
    “But it’s just not accurate enough.
    “Newer models — commonly used in social media to document people’s social interactions — offer an exciting way to address this gap in scientific knowledge.
    “These innovative network models give us a more accurate picture of reality by incorporating biology, allowing us to ask how one species responds to both environmental change and to the presence of other species, including humans.”
    The team used this technique to analyse fish populations in the Mediterranean Sea, a fisheries-based biodiversity hotspot with its future under threat from rapidly warming seas.
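    As a minimal sketch of that idea (generic Python with scikit-learn and purely synthetic data, not the authors’ actual model), a species’ presence can be modelled as a function of both an environmental driver, such as winter sea temperature, and the presence of other species:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Purely synthetic survey data, for illustration only: 200 sites with a
    # winter sea-surface temperature and presence/absence of two other species.
    n_sites = 200
    winter_sst = rng.normal(15.0, 2.0, n_sites)      # degrees C (made up)
    species_b = rng.integers(0, 2, n_sites)          # presence of a second species
    species_c = rng.integers(0, 2, n_sites)          # presence of a third species

    # Assume the focal species prefers cooler winters and co-occurs with species B.
    logit = -0.8 * (winter_sst - 15.0) + 1.2 * species_b - 0.3 * species_c
    focal_present = rng.random(n_sites) < 1.0 / (1.0 + np.exp(-logit))

    # Co-occurrence-style model: environment AND other species as predictors.
    X = np.column_stack([winter_sst, species_b, species_c])
    model = LogisticRegression().fit(X, focal_present)
    print(dict(zip(["winter_sst", "species_b", "species_c"], model.coef_[0])))
    ```

    A positive coefficient for another species would indicate co-occurrence beyond what temperature alone explains; the published network model is far richer, but the structure of the question is the same.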

    “Experts from fisheries, ecology and the geographical sciences have compiled decades of research to describe the geographical ranges for more than 600 Mediterranean fish species,” Dr Clark said.
    “We put this information, along with data from the Intergovernmental Panel on Climate Change’s sophisticated climate models, into our network model.
    “We found that warming seas — particularly in winter — have widespread effects on fish biodiversity.”
    The University of Otago’s Associate Professor Ceridwen Fraser said winter warming was often overlooked when people thought about the impacts of climate change.
    “A great deal of research and media attention has been on the impacts of extreme summer temperatures on people and nature, but winters are getting warmer too,” Dr Fraser said.

    “Interestingly, coastal water temperatures are expected to increase at a faster rate in winter than in summer.
    “Even though winter warming might not reach the extreme high temperatures of summer heatwaves, this research shows that warmer winters could also lead to ecosystem disruption, in some cases even more than summer warming will.
    “Our results suggest that winter warming will cause fish species to hang out together in different ways, and some species will disappear from some areas entirely.”
    The researchers hope the study will emphasise the need to understand and address climate change.
    “If fish communities are more strongly regulated by winter temperatures, as our model suggests, fish diversity may change more quickly than we previously thought,” Dr Clark said.
    “Catches for many bottom-dwelling and open-ocean fishery species in the Mediterranean Sea have been steadily declining, so any changes to fish communities may have widespread economic impacts.
    “For the sake of marine ecosystems and the people whose livelihoods depend on them, we need to gain a better understanding of how ocean warming will influence both species and economies.”

    Story Source:
    Materials provided by University of Queensland. Note: Content may be edited for style and length.

  • For next-generation semiconductors, 2D tops 3D

    Netflix, which provides an online streaming service around the world, offers 42 million videos and has about 160 million subscribers in total. It takes just a few seconds to download a 30-minute video clip, and you can watch a show within 15 minutes after it airs. As the distribution and transmission of high-quality content grow rapidly, it is critical to develop reliable and stable semiconductor memories.
    To this end, a POSTECH research team has developed a memory device using a two-dimensional layered-structure material, opening up the possibility of commercializing next-generation memory devices that can operate stably at low power.
    The team, consisting of Professor Jang-Sik Lee of the Department of Materials Science and Engineering, Professor Donghwa Lee of the Division of Advanced Materials Science, and Ph.D. candidates Youngjun Park and Seong Hun Kim, succeeded in designing an optimal halide perovskite material (CsPb2Br5) for ReRAM (resistive random-access memory) devices by applying first-principles calculations based on quantum mechanics. The findings were published in Advanced Science.
    The ideal next-generation memory device should process information at high speeds, store large amounts of information with non-volatile characteristics where the information does not disappear when power is off, and operate at low power for mobile devices.
    The recent discovery of the resistive switching property in halide perovskite materials has led to active research worldwide on applying them to ReRAM devices. However, the poor stability of halide perovskite materials when they are exposed to the atmosphere has been raised as an issue.
    The research team compared the relative stability and properties of halide perovskites with various structures using first-principles calculations. The DFT calculations predicted that CsPb2Br5, a two-dimensional layered structure of the form AB2X5, may have better stability than the three-dimensional ABX3 structure or other structures (A3B2X7, A2BX4), and that this structure could show improved performance in memory devices.
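    For context, relative stability in this kind of first-principles screening is often judged from a decomposition energy. One plausible comparison for this system (an illustrative scheme, not necessarily the exact reaction used in the paper) weighs CsPb2Br5 against a mixture of CsPbBr3 and PbBr2:

    \[ \Delta E = E_{\mathrm{tot}}(\mathrm{CsPb_2Br_5}) - E_{\mathrm{tot}}(\mathrm{CsPbBr_3}) - E_{\mathrm{tot}}(\mathrm{PbBr_2}) \]

    A negative \(\Delta E\) would indicate that the layered CsPb2Br5 phase is stable against decomposing into the two simpler compounds.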
    To verify this result, CsPb2Br5, an inorganic perovskite material with a two-dimensional layered structure, was synthesized and applied to memory devices for the first time. The memory devices based on the three-dimensional structure of CsPbBr3 lost their memory characteristics at temperatures above 100 °C. However, the memory devices using the two-dimensional layered structure of CsPb2Br5 maintained their memory characteristics above 140 °C and could be operated at voltages below 1 V.
    Professor Jang-Sik Lee who led the research commented, “Using this materials-designing technique based on the first-principles screening and experimental verification, the development of memory devices can be accelerated by reducing the time spent on searching for new materials. By designing an optimal new material for memory devices through computer calculations and applying it to actually producing them, the material can be applied to memory devices of various electronic devices such as mobile devices that require low power consumption or servers that require reliable operation. This is expected to accelerate the commercialization of next-generation data storage devices.”

    Story Source:
    Materials provided by Pohang University of Science & Technology (POSTECH). Note: Content may be edited for style and length.

  • Electron cryo-microscopy: Using inexpensive technology to produce high-resolution images

    Biochemists at Martin Luther University Halle-Wittenberg (MLU) have used a standard electron cryo-microscope to achieve surprisingly good images that are on par with those taken by far more sophisticated equipment. They have succeeded in determining the structure of ferritin almost at the atomic level. Their results were published in the journal PLOS ONE.
    Electron cryo-microscopy has become increasingly important in recent years, especially in shedding light on protein structures. The developers of the new technology were awarded the Nobel Prize for Chemistry in 2017. The trick: the samples are flash frozen and then bombarded with electrons. In the case of traditional electron microscopy, all of the water is first extracted from the sample. This is necessary because the investigation takes place in a vacuum, which means water would evaporate immediately and make imaging impossible. However, because water molecules play such an important role in biomolecules, especially in proteins, they cannot be examined using traditional electron microscopy. Proteins are among the most important building blocks of cells and perform a variety of tasks. In-depth knowledge of their structure is necessary in order to understand how they work.
    The research group led by Dr Panagiotis Kastritis, who is a group leader at the Centre for Innovation Competence HALOmem and a junior professor at the Institute of Biochemistry and Biotechnology at MLU, acquired a state-of-the-art electron cryo-microscope in 2019. “There is no other microscope like it in Halle,” says Kastritis. The new “Thermo Fisher Glacios 200 kV,” financed by the Federal Ministry of Education and Research, is not the best and most expensive microscope of its kind. Nevertheless, Kastritis and his colleagues succeeded in determining the structure of the iron storage protein apoferritin down to 2.7 ångströms (Å), in other words, almost down to the individual atom. One ångström equals one-tenth of a nanometre. This puts the research group in a similar league to departments with far more expensive equipment. Apoferritin is often used as a reference protein to determine the performance levels of corresponding microscopes. Just recently, two research groups broke a new record with a resolution of about 1.2 Å. “Such values can only be achieved using very powerful instruments, which only a handful of research groups around the world have at their disposal. Our method is designed for microscopes found in many laboratories,” explains Kastritis.
    Electron cryo-microscopes are very complex devices. “Even tiny misalignments can render the images useless,” says Kastritis. It is important to programme them correctly and Halle has the technical expertise to do this. But the analysis that is conducted after the data has been collected is just as important. “The microscope produces several thousand images,” explains Kastritis. Image processing programmes are used to create a 3D structure of the molecule. In cooperation with Professor Milton T. Stubbs from the Institute of Biochemistry and Biotechnology at MLU, the researchers have developed a new method to create a high-resolution model of a protein. Stubbs’ research group uses X-ray crystallography, another technique for determining the structure of proteins, which requires the proteins to be crystallised. They were able to combine a modified form of an image analysis technique with the images taken with the electron cryo-microscope. This made charge states and individual water molecules visible.
    “It’s an attractive method,” says Kastritis. Instead of very expensive microscopes, it requires a lot of computing capacity, which MLU has. Now, in addition to using X-ray crystallography, electron cryo-microscopy can be used to produce images of proteins — especially those that are difficult to crystallise. This enables collaboration, both inside and outside the university, on the structural analysis of samples with medical and biotechnological potential.

    Story Source:
    Materials provided by Martin-Luther-Universität Halle-Wittenberg. Note: Content may be edited for style and length.

  • New materials for extra thin computer chips

    Ever smaller and ever more compact — this is the direction in which computer chips are developing, driven by industry. This is why so-called 2D materials are considered to be the great hope: they are as thin as a material can possibly be, in extreme cases they consist of only one single layer of atoms. This makes it possible to produce novel electronic components with tiny dimensions, high speed and optimal efficiency.
    However, there is one problem: electronic components always consist of more than one material. 2D materials can only be used effectively if they can be combined with suitable material systems — such as special insulating crystals. If this is not considered, the advantage that 2D materials are supposed to offer is nullified. A team from the Faculty of Electrical Engineering at the TU Wien (Vienna) is now presenting these findings in the journal Nature Communications.
    Reaching the End of the Line on the Atomic Scale
    “The semiconductor industry today uses silicon and silicon oxide,” says Prof. Tibor Grasser from the Institute of Microelectronics at the TU Wien. “These are materials with very good electronic properties. For a long time, ever thinner layers of these materials were used to miniaturize electronic components. This worked well for a long time — but at some point we reach a natural limit.”
    When the silicon layer is only a few nanometers thick, so that it only consists of a few atomic layers, then the electronic properties of the material deteriorate very significantly. “The surface of a material behaves differently from the bulk of the material — and if the entire object is practically only made up of surfaces and no longer has a bulk at all, it can have completely different material properties.”
    Therefore, one has to switch to other materials in order to create ultra-thin electronic components. And this is where the so-called 2D materials come into play: they combine excellent electronic properties with minimal thickness.
    Thin Layers Need Thin Insulators
    “As it turns out, however, these 2D materials are only the first half of the story,” says Tibor Grasser. “The materials have to be placed on the appropriate substrate, and an insulator layer is also needed on top of it — and this insulator also has to be extremely thin and of extremely good quality, otherwise you have gained nothing from the 2D materials. It’s like driving a Ferrari on muddy ground and wondering why you don’t set a speed record.”
    A team at the TU Wien around Tibor Grasser and Yury Illarionov has therefore analysed how to solve this problem. “Silicon dioxide, which is normally used in industry as an insulator, is not suitable in this case,” says Tibor Grasser. “It has a very disordered surface and many free, unsaturated bonds that interfere with the electronic properties in the 2D material.”
    It is better to look for a well-ordered structure: The team has already achieved excellent results with special crystals containing fluorine atoms. A transistor prototype with a calcium fluoride insulator has already provided convincing data, and other materials are still being analysed.
    “New 2D materials are currently being discovered. That’s nice, but with our results we want to show that this alone is not enough,” says Tibor Grasser. “These new electrically conductive 2D materials must also be combined with new types of insulators. Only then can we really succeed in producing a new generation of efficient and powerful electronic components in miniature format.”

    Story Source:
    Materials provided by Vienna University of Technology. Original written by Florian Aigner. Note: Content may be edited for style and length.

  • A micro-lab on a chip detects blood type within minutes

    Blood transfusion, if performed promptly, is a potentially life-saving intervention for someone losing a lot of blood. However, blood comes in several types, some of which are incompatible with others. Transfusing an incompatible blood type can severely harm a patient. It is, therefore, critical for medical staff to know a patient’s blood type before they perform a transfusion.
    There are four major blood types — O, A, B, and AB. These types differ based on the presence or absence of structures called A antigens and B antigens on the surfaces of red blood cells. Blood can be further divided into positive and negative types based on the presence or absence of D antigens on red blood cells. Medical professionals usually tell a patient’s blood type with tests involving antibodies against the A and B antigens. When antibodies recognize the corresponding antigens, they bind to them, causing the blood cells to clump together and the blood to coagulate. Thus, specific antigen-antibody combinations tell us what the blood type of a blood sample is.
    Yet, while the concept sounds straightforward, the equipment and techniques required are often very specialized. Tests are therefore non-portable, involve high personnel costs, and can take over half an hour to yield results. This can prove problematic in several types of emergency situations.
    Aiming to solve these problems, a team of scientists at Japan’s Tokyo University of Science, led by Dr Ken Yamamoto and Dr Masahiro Motosuke, has developed a fully automated chip that can quickly and reliably determine a patient’s blood type. In the words of Dr Motosuke, he and his colleagues “have developed a compact and rapid blood-typing chip which also dilutes whole blood automatically.”
    The chip contains a micro-sized “laboratory” with various compartments through which the blood sample travels in sequence and is processed until results are obtained. To start the process, a user simply inserts a small amount of blood, presses a button, and waits for the result. Inside the chip, the blood is first diluted with a saline solution and air bubbles are introduced to promote mixing. The diluted blood is transported to a homogenizer where further mixing, driven by more intensely moving bubbles, yields a uniform solution. Portions of the homogenized blood solution are introduced into four different detector chambers. Two chambers each contain reagents that can detect either A antigens or B antigens. A third chamber contains reagents that detect D antigens and a fourth chamber contains only saline solution, with no reagent, and serves as a negative control chamber in which the user should not observe any results. Antigen-antibody reaction will cause blood to coagulate, and by looking at which chambers have coagulated blood, the user can tell the blood type and whether the blood is positive or negative.
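    The read-out logic itself is simple. As a short illustrative sketch (standard ABO/RhD typing logic expressed in Python, not software that ships with the chip), the pattern of coagulation across the four chambers maps directly to a blood type:

    ```python
    def blood_type(anti_a, anti_b, anti_d, control):
        """Map coagulation (True/False) in the four chambers to a blood type.

        `control` is the saline-only chamber: it must stay clear, otherwise
        the result cannot be trusted.
        """
        if control:
            return "invalid: control chamber coagulated"
        abo = {(False, False): "O", (True, False): "A",
               (False, True): "B", (True, True): "AB"}[(anti_a, anti_b)]
        rh = "+" if anti_d else "-"
        return abo + rh

    # Coagulation in the anti-A and anti-D chambers only -> type A positive.
    print(blood_type(anti_a=True, anti_b=False, anti_d=True, control=False))  # A+
    ```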
    Further, the user does not require specialized optical equipment to read the results. The design of the detector chambers allows the easy identification of coagulated blood with the naked eye. The device is also highly sensitive and can even detect weak coagulation.
    During testing, the research team screened blood samples from 10 donors and obtained accurate results for all 10 samples. The time needed to determine a single sample’s blood type was only five minutes.
    Reflecting on the potential benefits of his team’s invention, Dr Motosuke remarks, “The advancement of simple and quick blood test chip technologies will lead to the simplification of medical care in emergency situations and will greatly reduce costs and the necessary labor on the part of medical staff.” Given the highly portable nature of the chip, Professor Motosuke also speculates that it could be used during aerial medical transport and in disaster response settings. This is a chip that has the potential to change the way emergency medical support is given.

    Story Source:
    Materials provided by Tokyo University of Science. Note: Content may be edited for style and length.

  • Janggu makes deep learning a breeze

    Researchers from the MDC have developed a new tool that makes it easier to maximize the power of deep learning for studying genomics. They describe the new approach, Janggu, in the journal Nature Communications.
    Imagine that before you could make dinner, you first had to rebuild the kitchen, redesigning it specifically for each recipe. You’d spend far more time on preparation than on actually cooking. For computational biologists, analyzing genomics data has been a similarly time-consuming process. Before they can even begin their analysis, they spend a lot of valuable time formatting and preparing huge data sets to feed into deep learning models.
    To streamline this process, researchers from the Max Delbrueck Center for Molecular Medicine in the Helmholtz Association (MDC) developed a universal programming tool that converts a wide variety of genomics data into the required format for analysis by deep learning models. “Before, you ended up wasting a lot of time on the technical aspect, rather than focusing on the biological question you were trying to answer,” says Dr. Wolfgang Kopp, a scientist in the Bioinformatics and Omics Data Science research group at MDC’s Berlin Institute of Medical Systems Biology (BIMSB), and first author of the paper. “With Janggu, we are aiming to relieve some of that technical burden and make it accessible to as many people as possible.”
    Unique name, universal solution
    Janggu is named after a traditional Korean drum shaped like an hourglass turned on its side. The two large sections of the hourglass represent the two areas Janggu focuses on: pre-processing of genomics data, and visualization and evaluation of results. The narrow connector in the middle represents a placeholder for any type of deep learning model researchers wish to use.
    Deep learning models involve algorithms sorting through massive amounts of data and finding relevant features or patterns. While deep learning is a very powerful tool, its use in genomics has been limited. Most published models tend to work only with fixed types of data and can answer only one specific question. Swapping out or adding new data often requires starting over from scratch and extensive programming efforts.

    Janggu converts different genomics data types into a universal format that can be plugged into any machine learning or deep learning model that uses Python, a widely used programming language.
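    As a rough sketch of the kind of conversion involved (generic NumPy code, not Janggu’s actual API), a DNA sequence can be turned into a numeric array that a deep learning model can consume; setting order=2 encodes overlapping dinucleotides, a simple version of the “higher-order sequence encoding” described further below:

    ```python
    import numpy as np
    from itertools import product

    def encode_sequence(seq, order=1):
        """One-hot encode a DNA sequence into a (positions, 4**order) array.

        order=1 gives classic per-base one-hot encoding; order=2 encodes
        overlapping dinucleotides, capturing neighboring-base context.
        """
        alphabet = ["".join(p) for p in product("ACGT", repeat=order)]
        index = {kmer: i for i, kmer in enumerate(alphabet)}
        n = len(seq) - order + 1
        onehot = np.zeros((n, len(alphabet)), dtype=np.float32)
        for pos in range(n):
            kmer = seq[pos:pos + order]
            if kmer in index:              # skip ambiguous bases such as 'N'
                onehot[pos, index[kmer]] = 1.0
        return onehot

    x1 = encode_sequence("ACGTTGCA")       # shape (8, 4)
    x2 = encode_sequence("ACGTTGCA", 2)    # shape (7, 16), dinucleotide channels
    print(x1.shape, x2.shape)
    ```

    Janggu automates conversions of this kind, along with coverage tracks and the other formats described in the paper, so researchers do not have to hand-write such loops for every data set.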
    “What makes our approach special is that you can easily use any genomic data set for your deep learning problem; anything goes, in any format,” says Dr. Altuna Akalin, who heads the Bioinformatics and Omics Data Science research group.
    Separation is key
    Akalin’s research group has a dual mission: developing new machine learning tools, and using them to investigate questions in biology and medicine. During their own research efforts, the group was continually frustrated by how much time was spent formatting data. They realized part of the problem was that each deep learning model included its own data pre-processing. Separating data extraction and formatting from the analysis makes it much easier to interchange, combine or reuse sections of data. It’s kind of like having all the kitchen tools and ingredients at your fingertips, ready to try out a new recipe.
    “The difficulty was finding the right balance between flexibility and usability,” Kopp says. “If it is too flexible, people will be drowned in different options and it will be difficult to get started.”
    Kopp has prepared several tutorials to help others begin using Janggu, along with example datasets and case studies. The Nature Communications paper demonstrates Janggu’s versatility in handling very large volumes of data, combining data streams, and answering different types of questions, such as predicting binding sites from DNA sequences and/or chromatin accessibility, as well as for classification and regression tasks.
    Endless applications
    While most of Janggu’s benefit is on the front end, the researchers wanted to provide a complete solution for deep learning. Janggu also includes visualization of results after the deep learning analysis and evaluates what the model has learned. Notably, the team incorporated “higher-order sequence encoding” into the package, which makes it possible to capture correlations between neighboring nucleotides. This helped increase the accuracy of some analyses. By making deep learning easier and more user-friendly, Janggu helps throw open the door to answering all kinds of biological questions.
    “One of the most interesting applications is predicting the effect of mutations on gene regulation,” Akalin says. “This is exciting because now we can start understanding individual genomes, for instance, we can pinpoint genetic variants that cause regulatory changes, or we can interpret regulatory mutations occurring in tumors.”

  • New research shows that laser spectral linewidth is a classical-physics phenomenon

    New ground-breaking research from the University of Surrey could change the way scientists understand and describe lasers — establishing a new relationship between classical and quantum physics.
    In a comprehensive study published by the journal Progress in Quantum Electronics, a researcher from Surrey, in partnership with a colleague from Karlsruhe Institute of Technology and Fraunhofer IOSB in Germany, calls into question 60 years of orthodoxy surrounding the principles of lasers and the laser spectral linewidth — the foundation for controlling and measuring wavelengths of light.
    In the new study, the researchers find that a fundamental principle of lasers, that the amplification of light compensates for the losses of the laser, is only an approximation. The team quantifies and explains how a tiny excess loss, which is balanced not by the amplified light but by normal luminescence inside the laser, accounts for the spectral linewidth of the laser.
    One of these loss mechanisms, the outcoupling of light from the laser, produces the laser beam used in vehicle manufacturing, telecommunications, laser surgery, GPS and so much more.
    Markus Pollnau, Professor in Photonics at the University of Surrey, said: “Since the laser was invented in 1960, the laser spectral linewidth has been treated as the stepchild in the descriptions of lasers in textbooks and university teaching worldwide, because its quantum-physical explanation has posed extraordinary challenges even for lecturers.
    “As we have explained in this study, there is a simple, easy-to-understand derivation of the laser spectral linewidth, and the underlying classical physics proves the quantum-physics attempt of explaining the laser spectral linewidth hopelessly incorrect. This result has fundamental consequences for quantum physics.”

    Story Source:
    Materials provided by University of Surrey. Original written by Dalitso Njolinjo. Note: Content may be edited for style and length.