More stories

  • A statistical fix for archaeology's dating problem

    Archaeologists have long had a dating problem. The radiocarbon analysis typically used to reconstruct past human demographic changes relies on a method easily skewed by radiocarbon calibration curves and measurement uncertainty. And there’s never been a statistical fix that works — until now.
    “Nobody has systematically explored the problem, or shown how you can statistically deal with it,” says Santa Fe Institute archaeologist Michael Price, lead author on a paper in the Journal of Archaeological Science about a new method he developed for summarizing sets of radiocarbon dates. “It’s really exciting how this work came together. We identified a fundamental problem and fixed it.”
    In recent decades, archaeologists have increasingly relied on sets of radiocarbon dates to reconstruct past population size through an approach called “dates as data.” The core assumption is that the number of radiocarbon samples from a given period is proportional to the region’s population size at that time. Archaeologists have traditionally used “summed probability densities,” or SPDs, to summarize these sets of radiocarbon dates. “But there are a lot of inherent issues with SPDs,” says Julie Hoggarth, Baylor University archaeologist and a co-author on the paper.
    Radiocarbon dating measures the decay of carbon-14 in organic matter. But the amount of carbon-14 in the atmosphere fluctuates through time; it’s not a constant baseline. So researchers create radiocarbon calibration curves that map the carbon-14 values to dates. Yet a single carbon-14 value can correspond to different dates — a problem known as “equifinality,” which can naturally bias the SPD curves. “That’s been a major issue,” and a hurdle for demographic analyses, says Hoggarth. “How do you know that the change you’re looking at is an actual change in population size, and it isn’t a change in the shape of the calibration curve?”
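    To make the pitfall concrete, here is a minimal sketch (our illustration, not the paper's code) of how an SPD is assembled: each date's calibrated density is computed against a calibration curve and then summed, so any plateau or wiggle in the curve spreads a single measurement across many calendar years. The curve and dates below are toy stand-ins, not IntCal values.

    ```python
    # Toy SPD construction (illustrative only; curve and dates are made up).
    import numpy as np

    cal_years = np.arange(0, 3000)                           # calendar-year grid (years BP)
    curve_mean = cal_years + 40 * np.sin(cal_years / 50.0)   # toy calibration curve with wiggles
    curve_sd = np.full_like(cal_years, 15.0, dtype=float)    # toy curve uncertainty

    def calibrated_density(c14_age, c14_error):
        """Probability of each calendar year given one C-14 measurement."""
        total_sd = np.sqrt(c14_error**2 + curve_sd**2)
        density = np.exp(-0.5 * ((c14_age - curve_mean) / total_sd) ** 2)
        return density / density.sum()                       # normalize over the grid

    def spd(samples):
        """Sum the calibrated densities of many dates (the 'dates as data' summary)."""
        return sum(calibrated_density(age, err) for age, err in samples)

    # Where the curve is flat, one C-14 value maps to many calendar years
    # (equifinality), so the SPD can bulge with no real demographic change.
    summary = spd([(1500, 20), (1510, 25), (1800, 20)])
    ```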
    When she discussed the problem with Price several years ago, he told her he wasn’t a fan of SPDs, either. She asked what archaeologists should do instead. “Essentially, he said, ‘Well, there is no alternative.’”
    That realization led to a years-long quest. Price has developed an approach to estimating prehistoric populations that uses Bayesian reasoning and a flexible probability model that allows researchers to overcome the problem of equifinality. The approach also allows them to combine additional archaeological information with radiocarbon analyses to get a more accurate population estimate. He and his team applied the approach to existing radiocarbon dates from the Maya city of Tikal, which has extensive prior archaeological research. “It serves as a really good test case,” says Hoggarth, a Maya scholar. For a long time, archaeologists debated two demographic reconstructions: Tikal’s population spiked in the early Classic period and then plateaued, or it spiked in the late Classic period. When the team applied the new Bayesian algorithm, “it showed a really steep population increase associated with the late Classic,” she says, “so that was really wonderful confirmation for us.”
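    The gist of the Bayesian alternative can be sketched in a few lines. The code below is our simplified illustration of the reasoning, not the authors' model: instead of summing per-date densities, a candidate population curve is scored by the joint likelihood of all the radiocarbon measurements, with each measurement marginalized over the calendar years it could represent.

    ```python
    # Simplified Bayesian scoring of a population curve (illustration only).
    import numpy as np

    def log_likelihood(pop_weights, samples, curve_mean, curve_sd):
        """log P(all C-14 measurements | candidate population curve).

        pop_weights: nonnegative weights over a calendar-year grid, summing to 1;
        the 'dates as data' assumption is that dated samples are deposited in
        proportion to population size.
        """
        ll = 0.0
        for c14_age, c14_err in samples:
            sd = np.sqrt(c14_err**2 + curve_sd**2)
            per_year = np.exp(-0.5 * ((c14_age - curve_mean) / sd) ** 2)
            ll += np.log(np.dot(pop_weights, per_year) + 1e-300)  # marginalize over years
        return ll

    # Maximizing (or sampling) this score over a flexible family of pop_weights,
    # combined with a prior, yields a demographic curve that accounts for
    # equifinality instead of inheriting it from the calibration curve.
    ```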
    The authors produced an open-source package that implements the new approach, and website links and code are included in their paper. “The reason I’m excited for this,” Price says, “is that it’s pointing out a mistake that matters, fixing it, and laying the groundwork for future work.”
    This paper is just the first step. Next, through “data fusion,” the team will add ancient DNA and other data to radiocarbon dates for even more reliable demographic reconstructions. “That’s the long-term plan,” Price says. And it could help resolve a second issue with the “dates as data” approach: a “bias problem” that arises when radiocarbon dates are skewed toward a particular time period, leading to inaccurate analyses.
    Story Source:
    Materials provided by Santa Fe Institute.

  • Physicists make square droplets and liquid lattices

    When two substances are brought together, they will eventually settle into a steady state called thermodynamic equilibrium; in everyday life, we see examples of this when oil floats on top of water and when milk mixes uniformly into coffee. Researchers at Aalto University in Finland wanted to disrupt this sort of state to see what happens — and whether they could control the outcome.
    ‘Things in equilibrium tend to be quite boring,’ says Professor Jaakko Timonen, whose research group carried out new work published in Science Advances on 15 September. ‘It’s fascinating to drive systems out of equilibrium and see if the non-equilibrium structures can be controlled or be useful. Biological life itself is a good example of truly complex behavior in a bunch of molecules that are out of thermodynamic equilibrium.’
    In their work, the team used combinations of oils with different dielectric constants and conductivities. They then subjected the liquids to an electric field.
    ‘When we turn on an electric field over the mixture, electrical charge accumulates at the interface between the oils. This charge density shears the interface out of thermodynamic equilibrium and into interesting formations,’ explains Dr Nikos Kyriakopoulos, one of the authors of the paper. As well as being disrupted by the electric field, the liquids were confined into a thin, nearly two-dimensional sheet. This combination led to the oils reshaping into various completely unexpected droplets and patterns.
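    For a rough feel of the physics, the classic leaky-dielectric picture assigns each liquid a charge relaxation time, tau = permittivity / conductivity; a mismatch in these timescales across the interface is what lets free charge pile up there once the field is on. The values below are illustrative guesses, not the study's measurements.

    ```python
    # Back-of-the-envelope charge relaxation times (illustrative values only).
    EPS0 = 8.854e-12                          # vacuum permittivity, F/m

    def relaxation_time(rel_permittivity, conductivity):
        """Charge relaxation time tau = epsilon / sigma, in seconds."""
        return rel_permittivity * EPS0 / conductivity

    tau_a = relaxation_time(2.5, 1e-12)       # hypothetical highly insulating oil
    tau_b = relaxation_time(5.0, 1e-9)        # hypothetical leakier oil
    print(f"tau_a = {tau_a:.1e} s, tau_b = {tau_b:.1e} s")   # mismatched timescales
    ```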
    The droplets in the experiment could be made into squares and hexagons with straight sides, which is almost impossible in nature, where small bubbles and droplets tend to form spheres. The two liquids could also be made to form interconnected lattices: grid patterns that occur regularly in solid materials but are unheard of in liquid mixtures. The liquids could even be coaxed into forming a torus, a donut shape, which remained stable and held its shape while the field was applied — unlike in nature, where liquids have a strong tendency to collapse inward and fill the hole at the centre. The liquids could also form filaments that roll and rotate around an axis.
    ‘All these strange shapes are caused and sustained by the fact that they are prevented from collapsing back into equilibrium by the motion of the electrical charges building up at the interface,’ says Geet Raju, the first author of the paper.
    One of the exciting results of this work is the ability to create temporary structures with a controlled and well-defined size which can be turned on and off with voltage, an area that the researchers are interested in exploring further for creating voltage-controlled optical devices. Another potential outcome is the ability to create interacting populations of rolling microfilaments and microdroplets that, at some elementary level, mimic the dynamics and collective behaviour of microorganisms like bacteria and microalgae that propel themselves using completely different mechanisms.
    Story Source:
    Materials provided by Aalto University.

  • Using artificial intelligence to predict COVID patients' oxygen needs

    Addenbrooke’s Hospital in Cambridge, along with 20 other hospitals from across the world and healthcare technology leader NVIDIA, has used artificial intelligence (AI) to predict COVID-19 patients’ oxygen needs on a global scale.
    Sparked by the pandemic, the project set out to build an AI tool to predict how much extra oxygen a COVID-19 patient may need in the first days of hospital care, using data from across four continents.
    The technique, known as federated learning, used an algorithm to analyse chest X-rays and electronic health data from hospital patients with COVID-19 symptoms.
    To maintain strict patient confidentiality, the patient data was fully anonymised, and an algorithm was sent to each hospital so that no data was shared or left its location.
    Once the algorithm had ‘learned’ from the data, the analyses were brought together to build an AI tool that could predict the oxygen needs of hospitalised COVID-19 patients anywhere in the world.
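    The passage above describes the federated pattern rather than the study's code. A generic federated-averaging sketch, with hypothetical stand-in data for each hospital, looks like this:

    ```python
    # Generic federated averaging (schematic; not the EXAM study's code).
    # The model's weights travel to each site; the patient data never leaves.
    import numpy as np

    def local_update(global_weights, local_data, lr=0.01):
        """One site's local training step (stand-in linear-model gradient)."""
        X, y = local_data
        grad = X.T @ (X @ global_weights - y) / len(y)
        return global_weights - lr * grad

    def federated_round(global_weights, sites):
        """Average locally updated weights; only weights are ever shared."""
        return np.mean([local_update(global_weights, d) for d in sites], axis=0)

    rng = np.random.default_rng(0)
    sites = [(rng.normal(size=(50, 8)), rng.normal(size=50)) for _ in range(5)]
    weights = np.zeros(8)
    for _ in range(20):                       # communication rounds
        weights = federated_round(weights, sites)
    ```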
    Published today in Nature Medicine, the study, dubbed EXAM (for EMR CXR AI Model), is one of the largest, most diverse clinical federated learning studies to date.

  • Scientists develop 'optimal strategies' computer model that could significantly reduce future COVID-19 infections and deaths

    A team of scientists from Nanyang Technological University, Singapore (NTU Singapore) has developed a predictive computer model that, when tested on real pandemic data, proposed strategies that would have reduced the rate of both COVID-19 infections and deaths by an average of 72 per cent, based on a sample from four countries.
    The model, called NSGA-II, could be used to alert local governments in advance of possible surges in COVID-19 infections and mortalities, giving them time to put relevant countermeasures in place more rapidly.
    By testing NSGA-II on data from four Asian countries spanning 1 January 2020 to 31 December 2020, the team demonstrated that it could have helped reduce the number of COVID-19 infections and deaths by up to 76 per cent in Japan, 65 per cent in South Korea, 59 per cent in Pakistan, and 89 per cent in Nepal.
    The computer model achieved the result by recommending timely and country-specific advice on the optimal application and duration of COVID-19 interventions, such as home quarantines, social distancing measures, and personal protective measures that would help to thwart the negative impact of the pandemic.
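    NSGA-II belongs to a family of multi-objective genetic algorithms whose core step is Pareto non-dominated sorting. The sketch below shows only that step, on made-up intervention schedules scored by two objectives to minimize; it illustrates the principle, not the team's model.

    ```python
    # Pareto non-dominated sorting, the heart of NSGA-II (toy data).
    # Each candidate: (projected infections, intervention stringency), both minimized.
    def dominates(a, b):
        """True if a is no worse on every objective and better on at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(scores):
        """Indices of non-dominated solutions (the first front)."""
        return [i for i, a in enumerate(scores)
                if not any(dominates(b, a) for j, b in enumerate(scores) if j != i)]

    candidates = [(1200, 0.9), (900, 0.95), (2500, 0.4), (1000, 0.6), (3000, 0.2)]
    print(pareto_front(candidates))   # schedules no other schedule beats on both axes
    ```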
    The team also showed that NSGA-II could predict daily increases in confirmed COVID-19 cases and deaths with high accuracy, at a 95 per cent confidence level, when compared with the actual figures recorded in the four countries over the past year.
    Harnessing the power of machine learning, the research team developed NSGA-II by feeding it large amounts of worldwide data on COVID-19 mortalities and infections for the whole of 2020, helping it learn the dynamics of the pandemic. The research was reported in the peer-reviewed scientific journal Sustainable Cities and Society in August.

  • New DNA-based chip can be programmed to solve complex math problems

    The field of DNA computing has evolved by leaps and bounds since it was first proposed nearly 30 years ago. But most DNA computing processes are still performed manually, with reactants being added step-by-step by hand. Now, finally, scientists at Incheon National University, Korea, have found a way to automate DNA calculations by developing a unique chip that can be controlled by a personal computer.
    The term ‘DNA’ immediately calls to mind the double-stranded helix that contains all our genetic information. But the individual units of its two strands are pairs of molecules bonded with each other in a selective, complementary fashion. It turns out that one can take advantage of this pairing property to perform complex mathematical calculations, and this forms the basis of DNA computing.
    Since DNA has only two strands, performing even a simple calculation requires multiple chemical reactions using different sets of DNA. In most existing research, the DNA for each reaction is added manually, one by one, into a single reaction tube, which makes the process very cumbersome. Microfluidic chips, which consist of narrow channels etched onto a material like plastic, offer a way to automate the process. But despite their promise, the use of microfluidic chips for DNA computing remains underexplored.
    In a recent article — made available online in ACS Nano on 7 July 2021 and published in Volume 15 Issue 7 of the journal on 27 July 2021 — a team of scientists from Incheon National University (INU), Korea, present a programmable DNA-based microfluidic chip that can be controlled by a personal computer to perform DNA calculations. “Our hope is that DNA-based CPUs will replace electronic CPUs in the future because they consume less power, which will help with global warming. DNA-based CPUs also provide a platform for complex calculations like deep learning solutions and mathematical modelling,” says Dr. Youngjun Song from INU, who led the study.
    Dr. Song and team used 3D printing to fabricate their microfluidic chip, which can execute Boolean logic, one of the fundamental logics of computer programming. Boolean logic is a type of true-or-false logic that compares inputs and returns a value of ‘true’ or ‘false’ depending on the type of operation, or ‘logic gate,’ used. The logic gate in this experiment consisted of a single-stranded DNA template. Different single-stranded DNA were then used as inputs. If part of an input DNA had a complementary Watson-Crick sequence to the template DNA, it paired to form double-stranded DNA. The output was considered true or false based on the size of the final DNA.
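    That hybridization test can be mimicked in a few lines. The sketch below is a toy model of such a gate, not the INU chip's protocol: an input strand registers as 'true' at a gate if it contains the reverse complement of the gate's single-stranded template.

    ```python
    # Toy Watson-Crick logic gate (illustrative; sequences are made up).
    COMPLEMENT = str.maketrans("ATCG", "TAGC")

    def reverse_complement(strand: str) -> str:
        """Watson-Crick complement of a strand, read 5'->3'."""
        return strand.translate(COMPLEMENT)[::-1]

    def gate_output(template: str, input_strand: str) -> bool:
        """True if the input can hybridize to the template (form a duplex)."""
        return reverse_complement(template) in input_strand

    def and_gate(template_a: str, template_b: str, input_strand: str) -> bool:
        """AND gate: the input must bind both templates to report true."""
        return gate_output(template_a, input_strand) and gate_output(template_b, input_strand)

    print(gate_output("ATTGC", "ATGCAATCC"))  # True: input contains GCAAT
    ```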
    What makes the designed chip extraordinary is a motor-operated valve system that can be operated using a PC or smartphone. The chip and software set-up together form a microfluidic processing unit (MPU). Thanks to the valve system, the MPU could perform a series of reactions to execute a combination of logic operations in a rapid and convenient manner.
    This unique valve system of the programmable DNA-based MPU paves the way for more complex cascades of reactions that can code for extended functions. “Future research will focus on a total DNA computing solution with DNA algorithms and DNA storage systems,” says Dr. Song.
    Story Source:
    Materials provided by Incheon National University.

  • Finding a metal-oxide needle in a periodic table haystack

    I went to Caltech, and all I got was this T-shirt … and a new way to discover complex and interesting materials.
    Coupling computer automation with an ink-jet printer originally used to print T-shirt designs, researchers at Caltech and Google have developed a high-throughput method of identifying novel materials with interesting properties. In a trial run of the process, they screened hundreds of thousands of possible new materials and discovered one made from cobalt, tantalum, and tin that has tunable transparency and acts as a good catalyst for chemical reactions while remaining stable in strong acid electrolytes.
    The effort, described in a scientific article published in Proceedings of the National Academy of Sciences (PNAS), was led by John Gregoire and Joel Haber of Caltech, and Lusann Yang of Google. It builds on research conducted at the Joint Center for Artificial Photosynthesis (JCAP), a Department of Energy (DOE) Energy Innovation Hub at Caltech, and continues with JCAP’s successor, the Liquid Sunlight Alliance (LiSA), a DOE-funded effort that aims to streamline the complicated steps needed to convert sunlight into fuels, to make that process more efficient.
    Creating new materials is not as simple as dropping a few different elements into a test tube and shaking it up to see what happens. You need the elements that you combine to bond with each other at the atomic level to create something new and different rather than just a heterogeneous mixture of ingredients. With a nearly infinite number of possible combinations of the various squares on the periodic table, the challenge is knowing which combinations will yield such a material.
    “Materials discovery can be a bleak process. If you can’t predict where to find the desired properties, you could spend your entire career mixing random elements and never find anything interesting,” says Gregoire, research professor of applied physics and materials science, researcher at JCAP, and LiSA team lead.
    When combining a small number of individual elements, materials scientists can often make predictions about what properties a new material might have based on its constituent parts. However, that process quickly becomes untenable when more complicated mixtures are made.
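    The scale of that challenge is easy to quantify. Even a single three-element system, sampled at one-percent composition steps, already yields thousands of candidate mixtures to print and screen, which is why automation matters. The enumeration below is our illustration, not the team's pipeline.

    ```python
    # Counting candidate mixtures in one ternary system (illustrative).
    def ternary_compositions(step=1):
        """All (Co, Ta, Sn) atomic-percent triples summing to 100."""
        return [(a, b, 100 - a - b)
                for a in range(0, 101, step)
                for b in range(0, 101 - a, step)]

    print(len(ternary_compositions()))  # 5151 mixtures for a single ternary system
    ```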

  • Engineers create 3D-printed objects that sense how a user is interacting with them

    MIT researchers have developed a new method to 3D print mechanisms that detect how force is being applied to an object. The structures are made from a single piece of material, so they can be rapidly prototyped. A designer could use this method to 3D print “interactive input devices,” like a joystick, switch, or handheld controller, in one go.
    To accomplish this, the researchers integrated electrodes into structures made from metamaterials, which are materials divided into a grid of repeating cells. They also created editing software that helps users build these interactive devices.
    “Metamaterials can support different mechanical functionalities. But if we create a metamaterial door handle, can we also know that the door handle is being rotated, and if so, by how many degrees? If you have special sensing requirements, our work enables you to customize a mechanism to meet your needs,” says co-lead author Jun Gong, a former visiting PhD student at MIT who is now a research scientist at Apple.
    Gong wrote the paper alongside fellow lead authors Olivia Seow, a graduate student in the MIT Department of Electrical Engineering and Computer Science (EECS), and Cedric Honnet, a research assistant in the MIT Media Lab. Other co-authors are MIT graduate student Jack Forman and senior author Stefanie Mueller, who is an associate professor in EECS and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL). The research will be presented at the Association for Computing Machinery Symposium on User Interface Software and Technology next month.
    “What I find most exciting about the project is the capability to integrate sensing directly into the material structure of objects. This will enable new intelligent environments in which our objects can sense each interaction with them,” Mueller says. “For instance, a chair or couch made from our smart material could detect the user’s body when the user sits on it and either use it to query particular functions (such as turning on the light or TV) or to collect data for later analysis (such as detecting and correcting body posture).”
    Embedded electrodes
    Because metamaterials are made from a grid of cells, when the user applies force to a metamaterial object, some of the flexible, interior cells stretch or compress.
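    One generic way such sensing can work, sketched here as an illustration of the principle rather than the MIT group's actual circuit: if opposing walls of a cell carry electrodes, compressing the cell shrinks the gap between them, and the parallel-plate capacitance C = epsilon * A / d rises in a predictable way.

    ```python
    # Parallel-plate reading of cell compression (generic illustration).
    EPS0 = 8.854e-12                             # vacuum permittivity, F/m

    def cell_capacitance(area, gap, rel_permittivity=1.0):
        """Parallel-plate estimate: C = eps * A / d (farads)."""
        return rel_permittivity * EPS0 * area / gap

    rest = cell_capacitance(1e-4, 1e-3)          # 1 cm^2 electrodes, 1 mm gap
    pressed = cell_capacitance(1e-4, 0.5e-3)     # cell compressed to half the gap
    print(f"{rest:.2e} F -> {pressed:.2e} F")    # capacitance doubles under compression
    ```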

  • Scientists can now assemble entire genomes on their personal computers in minutes

    Scientists at the Massachusetts Institute of Technology (MIT) and the Institut Pasteur in France have developed a technique for reconstructing whole genomes, including the human genome, on a personal computer. This technique is about a hundred times faster than current state-of-the-art approaches and uses one-fifth the resources. The study, published September 14 in the journal Cell Systems, introduces a more compact representation of genome data inspired by the way in which words, rather than letters, offer condensed building blocks for language models.
    “We can quickly assemble entire genomes and metagenomes, including microbial genomes, on a modest laptop computer,” says Bonnie Berger (@lab_berger), the Simons Professor of Mathematics at the Computer Science and AI Lab at MIT and an author of the study. “This ability is essential in assessing changes in the gut microbiome linked to disease and bacterial infections, such as sepsis, so that we can more rapidly treat them and save lives.”
    Genome assembly projects have come a long way since the Human Genome Project, which finished assembling the first complete human genome in 2003 for the cost of about $2.7 billion and more than a decade of international collaboration. But while human genome assembly projects no longer take years, they still require several days and massive computer power. Third-generation sequencing technologies offer terabytes of high-quality genomic sequences with tens of thousands of base pairs, yet genome assembly using such an immense quantity of data has proved challenging.
    To approach genome assembly more efficiently than current techniques, which involve making pairwise comparisons between all possible pairs of reads, Berger and colleagues turned to language models. Building from the concept of a de Bruijn graph, a simple, efficient data structure used for genome assembly, the researchers developed a minimizer-space de Bruijn graph (mdBG), which uses short sequences of nucleotides called minimizers instead of single nucleotides.
    “Our minimizer-space de Bruijn graphs store only a small fraction of the total nucleotides, while preserving the overall genome structure, enabling them to be orders of magnitude more efficient than classical de Bruijn graphs,” says Berger.
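    To make the idea concrete, here is a small sketch using one standard windowed-minimizer scheme (the paper's exact selection rule may differ): each read is compressed to its ordered minimizers, and the graph's nodes become k-tuples of minimizers rather than k-mers of single nucleotides.

    ```python
    # Minimizer-space de Bruijn graph, schematically (windowed minimizers).
    def minimizers(seq, l=5, w=12):
        """Ordered window minimizers: the lexicographically smallest l-mer
        in each sliding window of w consecutive l-mers (adjacent repeats merged)."""
        mins = []
        for i in range(len(seq) - w - l + 2):
            window = [seq[j:j + l] for j in range(i, i + w)]
            smallest = min(window)
            if not mins or smallest != mins[-1]:
                mins.append(smallest)
        return mins

    def mdbg_edges(seq, k=3, l=5, w=12):
        """Edges between consecutive k-tuples of minimizers (the mdBG nodes)."""
        ms = minimizers(seq, l, w)
        nodes = [tuple(ms[i:i + k]) for i in range(len(ms) - k + 1)]
        return list(zip(nodes, nodes[1:]))
    ```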
    The researchers applied their method to assemble real HiFi data (which has almost perfect single-molecule read accuracy) for Drosophila melanogaster fruit flies, as well as human genome data provided by Pacific Biosciences (PacBio). When they evaluated the resulting genomes, Berger and colleagues found that their mdBG-based software required about 33 times less time and 8 times less random-access memory (RAM) than other genome assemblers. Their software performed genome assembly for the HiFi human data 81 times faster with 18 times less memory usage than the Peregrine assembler, and 338 times faster with 19 times less memory usage than the hifiasm assembler.