More stories

  • New DNA-based chip can be programmed to solve complex math problems

    The field of DNA computing has evolved by leaps and bounds since it was first proposed nearly 30 years ago. But most DNA computing processes are still performed manually, with reactants added to the reaction step by step by hand. Now, scientists at Incheon National University, Korea, have found a way to automate DNA calculations by developing a unique chip that can be controlled by a personal computer.
    The term ‘DNA’ immediately calls to mind the double-stranded helix that contains all our genetic information. But the individual units of its two strands are pairs of molecules that bond with each other in a selective, complementary fashion. It turns out that one can take advantage of this pairing property to perform complex mathematical calculations, and this forms the basis of DNA computing.
    Since DNA has only two strands, performing even a simple calculation requires multiple chemical reactions using different sets of DNA. In most existing research, the DNA for each reaction is added manually, one by one, into a single reaction tube, which makes the process very cumbersome. Microfluidic chips, which consist of narrow channels etched onto a material like plastic, offer a way to automate the process. But despite their promise, the use of microfluidic chips for DNA computing remains underexplored.
    In a recent article — made available online in ACS Nano on 7 July 2021 and published in Volume 15 Issue 7 of the journal on 27 July 2021 — a team of scientists from Incheon National University (INU), Korea, present a programmable DNA-based microfluidic chip that can be controlled by a personal computer to perform DNA calculations. “Our hope is that DNA-based CPUs will replace electronic CPUs in the future because they consume less power, which will help with global warming. DNA-based CPUs also provide a platform for complex calculations like deep learning solutions and mathematical modelling,” says Dr. Youngjun Song from INU, who led the study.
    Dr. Song and team used 3D printing to fabricate their microfluidic chip, which can execute Boolean logic, one of the fundamental logics of computer programming. Boolean logic is a type of true-or-false logic that compares inputs and returns a value of ‘true’ or ‘false’ depending on the type of operation, or ‘logic gate,’ used. The logic gate in this experiment consisted of a single-stranded DNA template. Different single-stranded DNA were then used as inputs. If part of an input DNA had a complementary Watson-Crick sequence to the template DNA, it paired to form double-stranded DNA. The output was considered true or false based on the size of the final DNA.
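    The strand-pairing readout described above can be sketched in a few lines of code. This is a toy model only (the sequences, threshold, and helper names below are invented for illustration, not taken from the INU chip's chemistry): a template strand acts as an AND gate, input strands bind where they are Watson-Crick complementary, and the output is read from how much of the template ends up double-stranded.

```python
# Toy model of a DNA AND gate (illustrative sketch, not the INU team's
# actual chemistry): a single-stranded template acts as the gate, input
# strands bind where they are Watson-Crick complementary, and the output
# is read from the length of the resulting double-stranded product.

COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(seq):
    return "".join(COMPLEMENT[b] for b in seq)

def hybridized_length(template, inputs):
    """Count template positions covered by a perfectly complementary input."""
    covered = [False] * len(template)
    for strand in inputs:
        target = complement(strand)      # the template region it can bind
        start = template.find(target)
        if start != -1:
            for i in range(start, start + len(target)):
                covered[i] = True
    return sum(covered)

def and_gate(template, input_a, input_b, threshold):
    """AND gate: True only if both inputs bind, pushing the duplex
    length past the readout threshold."""
    return hybridized_length(template, [input_a, input_b]) >= threshold

template = "ATGCATGGCCTTAA"            # hypothetical gate strand
a = complement("ATGCATG")              # binds the left half of the template
b = complement("GCCTTAA")              # binds the right half

print(and_gate(template, a, b, threshold=14))          # both bind -> True
print(and_gate(template, a, "TTTTTTT", threshold=14))  # one binds -> False
```

    In the real system the "readout" is a physical measurement of product size, but the logical structure is the same: the gate is true only when every required input hybridizes.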
    What makes the designed chip extraordinary is a motor-operated valve system that can be operated using a PC or smartphone. The chip and software set-up together form a microfluidic processing unit (MPU). Thanks to the valve system, the MPU could perform a series of reactions to execute a combination of logic operations in a rapid and convenient manner.
    This unique valve system of the programmable DNA-based MPU paves the way for more complex cascades of reactions that can code for extended functions. “Future research will focus on a total DNA computing solution with DNA algorithms and DNA storage systems,” says Dr. Song.
    Story Source:
    Materials provided by Incheon National University. Note: Content may be edited for style and length.

  • Finding a metal-oxide needle in a periodic table haystack

    I went to Caltech, and all I got was this T-shirt … and a new way to discover complex and interesting materials.
    Coupling computer automation with an ink-jet printer originally used to print T-shirt designs, researchers at Caltech and Google have developed a high-throughput method of identifying novel materials with interesting properties. In a trial run of the process, they screened hundreds of thousands of possible new materials and discovered one made from cobalt, tantalum, and tin that has tunable transparency and acts as a good catalyst for chemical reactions while remaining stable in strong acid electrolytes.
    The effort, described in a scientific article published in Proceedings of the National Academy of Sciences (PNAS), was led by John Gregoire and Joel Haber of Caltech, and Lusann Yang of Google. It builds on research conducted at the Joint Center for Artificial Photosynthesis (JCAP), a Department of Energy (DOE) Energy Innovation Hub at Caltech, and continues with JCAP’s successor, the Liquid Sunlight Alliance (LiSA), a DOE-funded effort that aims to streamline the complicated steps needed to convert sunlight into fuels, to make that process more efficient.
    Creating new materials is not as simple as dropping a few different elements into a test tube and shaking it up to see what happens. You need the elements that you combine to bond with each other at the atomic level to create something new and different rather than just a heterogeneous mixture of ingredients. With a nearly infinite number of possible combinations of the various squares on the periodic table, the challenge is knowing which combinations will yield such a material.
    “Materials discovery can be a bleak process. If you can’t predict where to find the desired properties, you could spend your entire career mixing random elements and never find anything interesting,” says Gregoire, research professor of applied physics and materials science, researcher at JCAP, and LiSA team lead.
    When combining a small number of individual elements, materials scientists can often make predictions about what properties a new material might have based on its constituent parts. However, that process quickly becomes untenable when more complicated mixtures are made.

  • Engineers create 3D-printed objects that sense how a user is interacting with them

    MIT researchers have developed a new method to 3D print mechanisms that detect how force is being applied to an object. The structures are made from a single piece of material, so they can be rapidly prototyped. A designer could use this method to 3D print “interactive input devices,” like a joystick, switch, or handheld controller, in one go.
    To accomplish this, the researchers integrated electrodes into structures made from metamaterials, which are materials divided into a grid of repeating cells. They also created editing software that helps users build these interactive devices.
    “Metamaterials can support different mechanical functionalities. But if we create a metamaterial door handle, can we also know that the door handle is being rotated, and if so, by how many degrees? If you have special sensing requirements, our work enables you to customize a mechanism to meet your needs,” says co-lead author Jun Gong, a former visiting PhD student at MIT who is now a research scientist at Apple.
    Gong wrote the paper alongside fellow lead authors Olivia Seow, a graduate student in the MIT Department of Electrical Engineering and Computer Science (EECS), and Cedric Honnet, a research assistant in the MIT Media Lab. Other co-authors are MIT graduate student Jack Forman and senior author Stefanie Mueller, who is an associate professor in EECS and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL). The research will be presented at the Association for Computing Machinery Symposium on User Interface Software and Technology next month.
    “What I find most exciting about the project is the capability to integrate sensing directly into the material structure of objects. This will enable new intelligent environments in which our objects can sense each interaction with them,” Mueller says. “For instance, a chair or couch made from our smart material could detect the user’s body when the user sits on it and either use it to query particular functions (such as turning on the light or TV) or to collect data for later analysis (such as detecting and correcting body posture).”
    Embedded electrodes
    Because metamaterials are made from a grid of cells, when the user applies force to a metamaterial object, some of the flexible, interior cells stretch or compress.
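    One hedged way to picture how embedded electrodes could register that deformation is a parallel-plate toy model (the dimensions and helper names below are invented for illustration; this is not the MIT team's actual circuitry): compressing a cell narrows the electrode gap, which raises capacitance, so scanning per-cell capacitance changes reveals where force was applied.

```python
# Toy illustration (not the MIT group's actual electronics): model each
# metamaterial cell's embedded electrodes as a parallel-plate capacitor.
# Compressing a cell reduces the electrode gap, raising its capacitance,
# so scanning cell capacitances reveals which cell is under load.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def cell_capacitance(area_m2, gap_m):
    return EPS0 * area_m2 / gap_m

def sense(rest_gap_m, compressions_m, area_m2=1e-4):
    """Return per-cell capacitance change relative to the rest state."""
    c_rest = cell_capacitance(area_m2, rest_gap_m)
    deltas = []
    for dx in compressions_m:
        c = cell_capacitance(area_m2, rest_gap_m - dx)
        deltas.append(c - c_rest)
    return deltas

# A 1x4 strip of cells; the third cell is compressed by 0.2 mm.
deltas = sense(rest_gap_m=1e-3, compressions_m=[0.0, 0.0, 2e-4, 0.0])
pressed = max(range(len(deltas)), key=lambda i: deltas[i])
print(pressed)  # index of the cell under load -> 2
```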

  • Scientists can now assemble entire genomes on their personal computers in minutes

    Scientists at the Massachusetts Institute of Technology (MIT) and the Institut Pasteur in France have developed a technique for reconstructing whole genomes, including the human genome, on a personal computer. This technique is about a hundred times faster than current state-of-the-art approaches and uses one-fifth the resources. The study, published September 14 in the journal Cell Systems, introduces a more compact representation of genome data inspired by the way in which words, rather than letters, offer condensed building blocks for language models.
    “We can quickly assemble entire genomes and metagenomes, including microbial genomes, on a modest laptop computer,” says Bonnie Berger (@lab_berger), the Simons Professor of Mathematics at the Computer Science and AI Lab at MIT and an author of the study. “This ability is essential in assessing changes in the gut microbiome linked to disease and bacterial infections, such as sepsis, so that we can more rapidly treat them and save lives.”
    Genome assembly projects have come a long way since the Human Genome Project, which finished assembling the first complete human genome in 2003 at a cost of about $2.7 billion and more than a decade of international collaboration. But while human genome assembly projects no longer take years, they still require several days and massive computer power. Third-generation sequencing technologies offer terabytes of high-quality genomic sequences with tens of thousands of base pairs, yet genome assembly using such an immense quantity of data has proved challenging.
    To approach genome assembly more efficiently than current techniques, which involve making pairwise comparisons between all possible pairs of reads, Berger and colleagues turned to language models. Building from the concept of a de Bruijn graph, a simple, efficient data structure used for genome assembly, the researchers developed a minimizer-space de Bruijn graph (mdBG), which uses short sequences of nucleotides called minimizers instead of single nucleotides.
    “Our minimizer-space de Bruijn graphs store only a small fraction of the total nucleotides, while preserving the overall genome structure, enabling them to be orders of magnitude more efficient than classical de Bruijn graphs,” says Berger.
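    The minimizer idea can be illustrated with a short sketch (a simplification with invented parameters, not the published assembler): within each sliding window of consecutive k-mers, pick the lexicographically smallest one as the minimizer, then link consecutive minimizers along the read into a graph.

```python
# Minimal sketch of the minimizer idea behind mdBG (heavily simplified;
# the published tool is far more involved). Nodes of the graph are
# window minimizers rather than single nucleotides, so only a small
# fraction of the sequence needs to be stored.

from collections import defaultdict

def minimizers(seq, k=3, w=4):
    """Ordered list of window minimizers (consecutive repeats collapsed)."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    picked = []
    for i in range(len(kmers) - w + 1):
        m = min(kmers[i:i + w])          # smallest k-mer in the window
        if not picked or picked[-1] != m:
            picked.append(m)
    return picked

def minimizer_space_graph(seq, k=3, w=4):
    """Edges link consecutive minimizers along the read."""
    graph = defaultdict(set)
    ms = minimizers(seq, k, w)
    for a, b in zip(ms, ms[1:]):
        graph[a].add(b)
    return dict(graph)

read = "ACGTACGGTTACGTA"
print(minimizers(read))             # ['ACG', 'CGG', 'GGT', 'ACG']
print(minimizer_space_graph(read))
```

    Because the graph is built over minimizers instead of every nucleotide, its size shrinks dramatically while the ordering of sequence landmarks, and hence the overall genome structure, is preserved.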
    The researchers applied their method to assemble real HiFi data (which has almost perfect single-molecule read accuracy) for Drosophila melanogaster fruit flies, as well as human genome data provided by Pacific Biosciences (PacBio). When they evaluated the resulting genomes, Berger and colleagues found that their mdBG-based software required about 33 times less time and 8 times less random-access memory (RAM) computing hardware than other genome assemblers. Their software performed genome assembly for the HiFi human data 81 times faster with 18 times less memory usage than the Peregrine assembler and 338 times faster with 19 times less memory usage than the hifiasm assembler.

  • New ocean temperature data help scientists make their hot predictions

    We’ve heard that rising temperatures will lead to rising sea levels, but what many may not realise is that most of the increase in energy in the climate system is occurring in the ocean.
    Now a study from UNSW Sydney and CSIRO researchers has shown that a relatively new ocean temperature measuring program — the Argo system of profiling floats — can help tell us which climate modelling for the 21st century we should be paying attention to the most.
    Professor John Church from UNSW’s Climate Change Research Centre in the School of Biological, Earth and Environmental Sciences says the study published today in Nature Climate Change is an attempt to narrow the projected range of future ocean temperature rises to the end of the 21st century using model simulations that are most consistent with the Argo’s findings in the years 2005 to 2019.
    “The models that projected very high absorption of heat by the ocean by 2100 also have unrealistically high ocean absorption over the Argo period of measurement,” Prof. Church says.
    “Likewise, there are models with lower heat absorption in the future that also don’t correspond to the Argo data. So we have effectively used the Argo observations to say, ‘which of these models best agree with the observations and therefore constrain projections for the future?’”
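    The screening logic Prof. Church describes can be sketched as a simple filter (all model names and numbers below are invented for illustration; they are not from the study): keep only the models whose simulated Argo-era ocean heat uptake falls within the observed range, and report those models' end-of-century projections as the constrained range.

```python
# Sketch of the observational screening logic described in the article
# (every model name and number here is invented for illustration): retain
# only climate models whose simulated ocean heat uptake over the Argo era
# (2005-2019) matches observations, then use those models' 2100 projections.

# (model, heat uptake over 2005-2019 in ZJ, projected ocean heat gain by 2100 in ZJ)
models = [
    ("model_hot",  160, 3200),   # absorbs unrealistically much over the Argo era
    ("model_mid1", 110, 2100),
    ("model_mid2", 120, 2300),
    ("model_cool",  60, 1200),   # absorbs unrealistically little
]

argo_low, argo_high = 90, 140    # hypothetical observed Argo-era range

consistent = [m for m in models if argo_low <= m[1] <= argo_high]
projections = [m[2] for m in consistent]

print([m[0] for m in consistent])            # models retained by the constraint
print((min(projections), max(projections)))  # narrowed 2100 projection range
```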
    Named after the ship on which the Greek mythological hero Jason sailed in search of the Golden Fleece, the Argo floats are loaded with high-tech equipment that measures ocean temperatures to depths of up to 2000 metres.

  • Taking lessons from a sea slug, study points to better hardware for artificial intelligence

    For artificial intelligence to get any smarter, it first needs to be as intelligent as one of the simplest creatures in the animal kingdom: the sea slug.
    A new study has found that a material can mimic the sea slug’s most essential intelligence features. The discovery is a step toward building hardware that could help make AI more efficient and reliable for technology ranging from self-driving cars and surgical robots to social media algorithms.
    The study, publishing this week in the Proceedings of the National Academy of Sciences, was conducted by a team of researchers from Purdue University, Rutgers University, the University of Georgia and Argonne National Laboratory.
    “Through studying sea slugs, neuroscientists discovered the hallmarks of intelligence that are fundamental to any organism’s survival,” said Shriram Ramanathan, a Purdue professor of materials engineering. “We want to take advantage of that mature intelligence in animals to accelerate the development of AI.”
    Two main signs of intelligence that neuroscientists have learned from sea slugs are habituation and sensitization. Habituation is getting used to a stimulus over time, such as tuning out noises when driving the same route to work every day. Sensitization is the opposite — it’s reacting strongly to a new stimulus, like avoiding bad food from a restaurant.
    AI has a really hard time learning and storing new information without overwriting information it has already learned and stored, a problem that researchers studying brain-inspired computing call the “stability-plasticity dilemma.” Habituation would allow AI to “forget” unneeded information (achieving more stability) while sensitization could help with retaining new and important information (enabling plasticity).
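    The two behaviors can be captured in a toy gain model (an illustrative sketch with invented constants, not the study's device physics): repeated mild stimuli shrink the response gain, mimicking habituation, while a strong stimulus boosts it, mimicking sensitization.

```python
# Toy dynamics of the two behaviors the article describes (a sketch, not
# the study's actual material model): habituation decays the response to
# a repeated mild stimulus, while sensitization raises the gain after a
# strong stimulus.

def respond(stimuli, decay=0.7, boost=1.5, strong=0.8):
    """Return the response to each stimulus in sequence."""
    gain = 1.0
    responses = []
    for s in stimuli:
        responses.append(gain * s)
        if s >= strong:
            gain *= boost    # sensitization: strong input raises gain
        else:
            gain *= decay    # habituation: mild repetition lowers gain
    return responses

# Repeated mild stimulus -> dwindling response (habituation) ...
mild = respond([0.5, 0.5, 0.5, 0.5])
# ... while a strong stimulus mid-sequence re-amplifies later responses.
mixed = respond([0.5, 0.5, 0.9, 0.5])
print(mild)
print(mixed)
```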

  • Just by changing its shape, scientists show they can alter material properties

    By confining the transport of electrons and ions in a patterned thin film, scientists find a way to potentially enhance material properties for design of next-generation electronics
    Like ripples in a pond, electrons travel like waves through materials, and when they collide and interact, they can give rise to new and interesting patterns.
    Scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory have seen a new kind of wave pattern emerge in a thin film of metal oxide known as titania when its shape is confined. Confinement, the act of restricting materials within a boundary, can alter the properties of a material and the movement of molecules through it.
    In the case of titania, it caused electrons to interfere with each other in a unique pattern, which increased the oxide’s conductivity, or the degree to which it conducts electricity. This all happened at the mesoscale, a scale where scientists can see both quantum effects and the movement of electrons and molecules.
    In all, this work offers scientists more insight about how atoms, electrons and other particles behave at the quantum level. Such information could aid in designing new materials that can process information and be useful in other electronic applications.
    “What really set this work apart was the size of the scale we investigated,” said lead author Frank Barrows, a Northwestern University graduate student in Argonne’s Materials Science Division (MSD). “Investigating at this unique length scale enabled us to see really interesting phenomena that indicate there is interference happening at the quantum level, and at the same time gain new information about how electrons and ions interact.”
    Altering geometry to change material properties

  • Do Alexa and Siri make kids bossier? New research suggests you might not need to worry

    Chatting with a robot is now part of many families’ daily lives, thanks to conversational agents such as Apple’s Siri or Amazon’s Alexa. Recent research has shown that children are often delighted to find that they can ask Alexa to play their favorite songs or call Grandma.
    But does hanging out with Alexa or Siri affect the way children communicate with their fellow humans? Probably not, according to a recent study led by the University of Washington that found that children are sensitive to context when it comes to these conversations.
    The team had a conversational agent teach 22 children between the ages of 5 and 10 to use the word “bungo” to ask it to speak more quickly. The children readily used the word when the agent slowed its speech. While most children did use bungo in conversations with their parents, it became a source of play or an inside joke about acting like a robot. But when a researcher spoke slowly to the children, the kids rarely used bungo, and often patiently waited for the researcher to finish talking before responding.
    The researchers published their findings in June at the 2021 Interaction Design and Children conference.
    “We were curious to know whether kids were picking up conversational habits from their everyday interactions with Alexa and other agents,” said senior author Alexis Hiniker, a UW assistant professor in the Information School. “A lot of the existing research looks at agents designed to teach a particular skill, like math. That’s somewhat different from the habits a child might incidentally acquire by chatting with one of these things.”
    The researchers recruited 22 families from the Seattle area to participate in a five-part study. This project took place before the COVID-19 pandemic, so each child visited a lab with one parent and one researcher. For the first part of the study, children spoke to a simple animated robot or cactus on a tablet screen that also displayed the text of the conversation.