More stories

  • When algorithms get creative

    Our brains are incredibly adaptive. Every day, we form new memories, acquire new knowledge, or refine existing skills. This stands in marked contrast to our current computers, which typically only perform pre-programmed actions. At the core of our adaptability lies synaptic plasticity. Synapses are the connection points between neurons, which can change in different ways depending on how they are used. This synaptic plasticity is an important research topic in neuroscience, as it is central to learning processes and memory. To better understand these brain processes and build adaptive machines, researchers in the fields of neuroscience and artificial intelligence (AI) are creating models for the mechanisms underlying these processes. Such models for learning and plasticity help to understand biological information processing and should also enable machines to learn faster.
    Algorithms mimic biological evolution
    Working in the European Human Brain Project, researchers at the Institute of Physiology at the University of Bern have now developed a new approach based on so-called evolutionary algorithms. These computer programs search for solutions to problems by mimicking the process of biological evolution, such as the concept of natural selection. Thus, biological fitness, which describes the degree to which an organism adapts to its environment, becomes a model for evolutionary algorithms. In such algorithms, the “fitness” of a candidate solution is how well it solves the underlying problem.
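    To make the idea of “fitness” for candidate solutions concrete, here is a minimal, generic evolutionary-algorithm sketch in Python. It is not the E2L method from the study; the toy target vector, population size and mutation settings are illustrative assumptions.

    ```python
    # Minimal evolutionary-algorithm sketch (illustrative, not the study's E2L method):
    # candidates are real-valued vectors, and "fitness" measures how well each one
    # solves a toy problem (here: matching a hidden target vector).
    import random

    TARGET = [0.3, -1.2, 0.7]                    # hypothetical problem definition
    POP_SIZE, GENERATIONS, MUT_STD = 20, 100, 0.1

    def fitness(candidate):
        # Higher is better: negative squared distance to the target.
        return -sum((c - t) ** 2 for c, t in zip(candidate, TARGET))

    def mutate(candidate):
        return [c + random.gauss(0.0, MUT_STD) for c in candidate]

    population = [[random.uniform(-2, 2) for _ in TARGET] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Selection: keep the fitter half, then refill by mutating survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[: POP_SIZE // 2]
        population = survivors + [mutate(random.choice(survivors)) for _ in survivors]

    print("best candidate:", max(population, key=fitness))
    ```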
    Amazing creativity
    The newly developed approach is referred to as the “evolving-to-learn” (E2L) approach or “becoming adaptive.” The research team, led by Dr. Mihai Petrovici of the Institute of Physiology at the University of Bern and the Kirchhoff Institute for Physics at the University of Heidelberg, confronted the evolutionary algorithms with three typical learning scenarios. In the first, the computer had to detect a repeating pattern in a continuous stream of input without receiving feedback about its performance. In the second scenario, the computer received virtual rewards when behaving in a particular desired manner. Finally, in the third scenario of “guided learning,” the computer was precisely told how much its behavior deviated from the desired one.
    “In all these scenarios, the evolutionary algorithms were able to discover mechanisms of synaptic plasticity, and thereby successfully solved a new task,” says Dr. Jakob Jordan, corresponding and co-first author from the Institute of Physiology at the University of Bern. In doing so, the algorithms showed amazing creativity: “For example, the algorithm found a new plasticity model in which signals we defined are combined to form a new signal. In fact, we observe that networks using this new signal learn faster than with previously known rules,” emphasizes Dr. Maximilian Schmidt from the RIKEN Center for Brain Science in Tokyo, co-first author of the study. The results were published in the journal eLife.
    “We see E2L as a promising approach to gain deep insights into biological learning principles and accelerate progress towards powerful artificial learning machines,” says Mihai Petrovici. “We hope it will accelerate the research on synaptic plasticity in the nervous system,” concludes Jakob Jordan. The findings will provide new insights into how healthy and diseased brains work. They may also pave the way for the development of intelligent machines that can better adapt to the needs of their users.
    Story Source:
    Materials provided by University of Bern. Note: Content may be edited for style and length.

  • Adding sound to quantum simulations

    When sound was first incorporated into movies in the 1920s, it opened up new possibilities for filmmakers, such as music and spoken dialogue. Physicists may be on the verge of a similar revolution, thanks to a new device developed at Stanford University that promises to bring an audio dimension to previously silent quantum science experiments.
    In particular, it could bring sound to a common quantum science setup known as an optical lattice, which uses a crisscrossing mesh of laser beams to arrange atoms in an orderly manner resembling a crystal. This tool is commonly used to study the fundamental characteristics of solids and other phases of matter that have repeating geometries. A shortcoming of these lattices, however, is that they are silent.
    “Without sound or vibration, we miss a crucial degree of freedom that exists in real materials,” said Benjamin Lev, associate professor of applied physics and of physics, who set his sights on this issue when he first came to Stanford in 2011. “It’s like making soup and forgetting the salt; it really takes the flavor out of the quantum ‘soup.’”
    After a decade of engineering and benchmarking, Lev and collaborators from Pennsylvania State University and the University of St. Andrews have produced the first optical lattice of atoms that incorporates sound. The research was published Nov. 11 in Nature. By designing a very precise cavity that held the lattice between two highly reflective mirrors, the researchers made it so the atoms could “see” themselves repeated thousands of times via particles of light, or photons, that bounce back and forth between the mirrors. This feedback causes the photons to behave like phonons — the building blocks of sound.
    “If it were possible to put your ear to the optical lattice of atoms, you would hear their vibration at around 1 kHz,” said Lev.
    A supersolid with sound
    Previous optical lattice experiments were silent affairs because they lacked the special elasticity of this new system. Lev, then-graduate student Sarang Gopalakrishnan — now an assistant professor of physics at Penn State and co-author of the paper — and Paul Goldbart (now provost of Stony Brook University) came up with the foundational theory for this system. But it took collaboration with Jonathan Keeling — a reader at the University of St. Andrews and co-author of the paper — and years of work to build the corresponding device.

  • Nuclear radiation used to transmit digital data wirelessly

    Engineers have successfully transferred digitally encoded information wirelessly using nuclear radiation instead of conventional technology.
    Radio waves and mobile phone signals rely on electromagnetic radiation for communication, but in a new development, engineers from Lancaster University in the UK, working with the Jožef Stefan Institute in Slovenia, transferred digitally encoded information using “fast neutrons” instead.
    The researchers measured the spontaneous emission of fast neutrons from californium-252, a radioactive isotope produced in nuclear reactors.
    Modulated emissions were measured using a detector and recorded on a laptop.
    Several examples of information (a word, the alphabet and a random number selected blindly) were encoded serially into the modulation of the neutron field, and the output was decoded on a laptop, which recovered the encoded information on screen.
    A double-blind test was performed in which a number derived from a random number generator was encoded without prior knowledge of those uploading it, and then transmitted and decoded.
    All transmission tests attempted proved to be 100% successful.
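    The article does not specify the modulation scheme, so the following Python sketch is only a rough illustration of serially encoding text into an on/off modulated signal and decoding it again; the count levels and threshold are hypothetical.

    ```python
    # Illustrative serial encode/decode round trip (not the actual modulation scheme
    # used in the study): each bit of the message is represented by a "high" or "low"
    # neutron count in a fixed time window.
    HIGH, LOW = 120, 40          # hypothetical counts per window for a 1 and a 0

    def encode(text):
        bits = [int(b) for ch in text.encode("ascii") for b in format(ch, "08b")]
        return [HIGH if bit else LOW for bit in bits]

    def decode(counts, threshold=80):
        bits = ["1" if c > threshold else "0" for c in counts]
        chars = [chr(int("".join(bits[i:i + 8]), 2)) for i in range(0, len(bits), 8)]
        return "".join(chars)

    counts = encode("NEUTRON")
    assert decode(counts) == "NEUTRON"   # lossless round trip on this toy channel
    ```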
    Professor Malcolm Joyce of Lancaster University said: “We demonstrate the potential of fast neutron radiation as a medium for wireless communications for applications where conventional electromagnetic transmission is either not feasible or is inherently limited.”
    He said fast neutrons have an advantage over conventional electromagnetic waves, which are significantly weakened by transmission through materials including metals.
    “In some safety-critical scenarios, such as concerning the integrity of reactor containments, and metal vaults and bulkheads in maritime structures, it can be important to minimise the number of penetrations made through such metal structures for communications cabling. The use of neutrons for information transmission through such structures could negate the need for such penetrations and is perhaps also relevant to scenarios where limited transmissions are desirable in difficult circumstances, such as for emergency rescue operations.”
    Fast neutrons could also be incorporated into mixed-signal electronic systems to achieve signal mixing between electrons and neutrons. This could help ensure the integrity of information transfer.
    Story Source:
    Materials provided by Lancaster University. Note: Content may be edited for style and length.

  • New computer model is a key step toward low-temperature preservation of 3D tissues, organs

    Medical science is a key step closer to the cryopreservation of brain slices used in neurological research, pancreatic cells for the treatment of diabetes and even whole organs, thanks to a new computer model that predicts how a tissue’s size will change during the preservation process.
    Findings of the study led by Adam Higgins of the Oregon State University College of Engineering were published in Biophysical Journal.
    “Cryopreservation of tissues would be useful for biomedical research and for transplantation medicine, but it’s difficult to cryopreserve tissues for various reasons,” said Higgins, associate professor of bioengineering. “A major reason is that formation of ice can break apart a tissue from the inside. Folks who cook are probably already familiar with this — a tomato that has been frozen and thawed becomes mushy.”
    Cryopreservation has long been widely used in comparatively simpler applications such as preserving semen, blood, embryos and plant seeds. A barrier to other uses has been damage from ice crystallization and the harmful nature of the compounds added to prevent ice formation.
    Vitrification, Higgins explains, is a cryopreservation strategy that thwarts ice crystal damage through chemicals known as cryoprotectants, or CPAs, that can keep ice from forming. An example of a CPA is ethylene glycol, used in automobile antifreeze.
    In tissues, a high enough concentration of CPAs causes a solid “glass” to form rather than ice crystals when tissue temperature is reduced to liquid nitrogen levels; liquid nitrogen boils at minus-320 degrees Fahrenheit.
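    As a quick unit check on that figure, a short Python conversion from Fahrenheit to Celsius and kelvin is shown below; the values are rounded and only meant to confirm the rough magnitude.

    ```python
    # Convert the quoted -320 °F to Celsius and kelvin: C = (F - 32) * 5/9.
    f = -320.0
    c = (f - 32.0) * 5.0 / 9.0
    print(f"{f} °F = {c:.1f} °C = {c + 273.15:.1f} K")   # about -195.6 °C, about 77.6 K
    ```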

  • Identifying individual proteins using nanopores and supercomputers

    The amount and types of proteins our cells produce tell us important details about our health and how our bodies work. But the methods we have of identifying and quantifying individual proteins are inadequate to the task. Not only is the diversity of proteins unknown, but often, amino acids are changed after synthesis through post-translational modifications.
    In recent years, much progress has been made in DNA reading using nanopores — minute openings in a membrane just large enough to let an unspooled DNA strand through. By carefully measuring the ionic current through the nanopore as DNA passes through, biologists have been able to rapidly identify the order of bases in the sequence. In fact, this year, nanopores were used to finally sequence the entire human genome — something that was not previously possible with other technologies.
    In new research out in Science magazine, researchers from Delft University of Technology in the Netherlands and the University of Illinois at Urbana-Champaign (UIUC) in the U.S. have extended these DNA nanopore successes and provided a proof of concept that the same method is possible for single-protein identification, characterizing proteins with single-amino-acid resolution and a vanishingly small (10^-6, or 1 in a million) margin of error.
    “This nanopore peptide reader provides site-specific information about the peptide’s primary sequence that may find applications in single-molecule protein fingerprinting and variant identification,” the authors wrote.
    The workhorses of our cells, proteins are long peptide strings made of 20 different types of amino acids. The researchers utilized an enzyme called helicase Hel308 that can attach to DNA-peptide hybrids and pull them, in a controlled way, through a biological nanopore known as MspA (Mycobacterium smegmatis porin A). They chose the Hel308 DNA helicase because it can pull peptides through the pore in observable half-nucleotide steps, which correspond closely to single amino acids.
    Each step through the narrow gate theoretically produces a unique current signal as the amino acid partially blocks an electrical current carried by ions through the nanopore.
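    As a toy illustration of that idea (and not the analysis pipeline used in the study), each step's measured current could be assigned to the amino acid with the closest reference level; the reference values below are invented for demonstration.

    ```python
    # Nearest-reference classification of per-step current levels (toy example).
    # Reference blockade levels (in pA) are hypothetical, chosen for illustration only.
    REFERENCE_LEVELS = {"G": 42.0, "A": 45.5, "S": 48.0, "L": 53.5, "W": 61.0}

    def classify_steps(step_currents):
        calls = []
        for current in step_currents:
            best = min(REFERENCE_LEVELS, key=lambda aa: abs(REFERENCE_LEVELS[aa] - current))
            calls.append(best)
        return "".join(calls)

    print(classify_steps([45.2, 53.9, 42.3, 60.5]))   # -> "ALGW" for this toy trace
    ```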

  • Data available for training AI to spot skin cancer are insufficient and lacking in pictures of darker skin

    The images and accompanying data available for training artificial intelligence (AI) to spot skin cancer are insufficient and include very few images of darker skin, according to research presented at the NCRI Festival and published in Lancet Digital Health. 
    AI is increasingly being used in medicine as it can make diagnosis of diseases like skin cancer quicker and more effective. However, AI needs to be ‘trained’ by looking at data and images from a large number of patients where the diagnosis has already been established, and so an AI program depends heavily upon the information it is trained on.
    Researchers say there is an urgent need for better sets of data on skin cancers and other skin lesions which contain information on who is represented in the datasets.
    The research was presented by Dr David Wen from the University of Oxford, UK. He said: “AI programs hold a lot of potential for diagnosing skin cancer because it can look at pictures and quickly and cost-effectively evaluate any worrying spots on the skin. However, it’s important to know about the images and patients used to develop programs, as these influence which groups of people the programs will be most effective for in real-life settings. Research has shown that programs trained on images taken from people with lighter skin types only might not be as accurate for people with darker skin, and vice versa.”
    Dr Wen and his colleagues carried out the first ever review of all freely accessible sets of data on skin lesions around the world. They found 21 sets including more than 100,000 pictures.
    Diagnosis of skin cancer normally requires a photo of the worrying lesion as well as a picture taken with a special hand-held magnifier, called a dermatoscope, but only two of the 21 datasets included images taken with both of these methods. The datasets were also missing other important information, such as how images were chosen to be included, and evidence of ethical approval or patient consent.
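    The kind of audit described here can be pictured as a simple metadata check over dataset records; the Python sketch below uses hypothetical entries and fields, not the actual 21 datasets from the review.

    ```python
    # Toy dataset audit: which sets include both image types, and which report skin tone?
    datasets = [
        {"name": "set_a", "clinical_photo": True,  "dermoscopic": True,  "skin_tone_reported": False},
        {"name": "set_b", "clinical_photo": True,  "dermoscopic": False, "skin_tone_reported": False},
        {"name": "set_c", "clinical_photo": False, "dermoscopic": True,  "skin_tone_reported": True},
    ]

    both_modalities = [d["name"] for d in datasets if d["clinical_photo"] and d["dermoscopic"]]
    tone_reported = [d["name"] for d in datasets if d["skin_tone_reported"]]
    print("both image types:", both_modalities)
    print("skin tone reported:", tone_reported)
    ```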

  • Engineers design autonomous robot that can open doors, find wall outlet to recharge

    One flaw in the notion that robots will take over the world is that the world is full of doors.
    And doors are kryptonite to robots, said Ou Ma, an aerospace engineering professor at the University of Cincinnati.
    “Robots can do many things, but if you want one to open a door by itself and go through the doorway, that’s a tremendous challenge,” Ma said.
    Students in UC’s Intelligent Robotics and Autonomous Systems Laboratory have solved this complex problem in three-dimensional digital simulations. Now they’re building an autonomous robot that not only can open its own doors but also can find the nearest electric wall outlet to recharge without human assistance.
    This simple advance in independence represents a huge leap forward for helper robots that vacuum and disinfect office buildings, airports and hospitals. Helper robots are part of a $27 billion robotics industry, which includes manufacturing and automation.
    The study was published in the journal IEEE Access.
    UC College of Engineering and Applied Science doctoral student Yufeng Sun, the study’s lead author, said some researchers have addressed the problem by scanning an entire room to create a 3D digital model so the robot can locate a door. But that is a time-consuming custom solution that works only for the particular room that is scanned.

  • Global river database documents 40 years of change

    A first-ever database compiling movement of the largest rivers in the world over time could become a crucial tool for urban planners to better understand the deltas that are home to these rivers and a large portion of Earth’s population.
    The database, created by researchers at The University of Texas at Austin, uses publicly available remote sensing data to show how the river centerlines of the world’s 48 most threatened deltas have moved during the past 40 years. The data can be used to predict how rivers will continue to move over time and help governments manage population density and future development.
    “When we think about river management strategies, we have very little to no information about how rivers are moving over time,” said Paola Passalacqua, an associate professor in the Cockrell School of Engineering’s Department of Civil, Architectural and Environmental Engineering who leads the ongoing river analysis research.
    The research was published today in Proceedings of the National Academy of Sciences.
    The database includes three U.S. rivers: the Mississippi, the Colorado and the Rio Grande. Although some areas of these deltas are experiencing migration, overall, they are mostly stable, the data show. Aggressive containment strategies to keep those rivers in their place, especially near population centers, play a role in that, Passalacqua said.
    Average migration rates for each river delta help identify which areas are stable and which are experiencing major river shifts. The researchers also published more extensive data online that includes information about how different segments of rivers have moved over time. It could help planners see what’s going on in rural areas vs. urban areas when making decisions about how to manage the rivers and what to do with development.
    The researchers leaned on techniques from a variety of disciplines to compile the data and published their methods online. Machine learning and image processing software helped them examine decades’ worth of images. The researchers worked with Alan Bovik of the Department of Electrical and Computer Engineering and doctoral student Leo Isikdogan to develop that technology. They also borrowed from fluid mechanics, using tools designed to monitor water particles in turbulence experiments to instead track changes to river locations over the years.
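    The underlying measurement, how far a river centerline has shifted between two snapshots, can be sketched very simply. The Python example below (not the pipeline used by the authors) estimates a mean migration rate from two hypothetical digitized centerlines.

    ```python
    # Rough migration estimate between two digitized centerlines (coordinates in metres,
    # hypothetical): average nearest-point distance divided by the elapsed years.
    import math

    centerline_1984 = [(0, 0), (100, 10), (200, 30), (300, 35)]
    centerline_2024 = [(0, 12), (100, 28), (200, 52), (300, 60)]

    def nearest_distance(point, line):
        return min(math.dist(point, q) for q in line)

    offsets = [nearest_distance(p, centerline_2024) for p in centerline_1984]
    mean_offset = sum(offsets) / len(offsets)          # metres of movement over 40 years
    print(f"mean migration rate: {mean_offset / 40:.2f} m/yr")
    ```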
    “We got the idea to use tools from fluid mechanics while attending a weekly department seminar where other researchers at the university share their work,” said Tess Jarriel, a graduate research assistant in Passalacqua’s lab and lead author of the paper. “It just goes to show how important it is to collaborate across disciplines.”
    Rivers with high sediment flux and frequent flooding naturally move more; this tendency is part of an important tradeoff that underpins Passalacqua’s research.
    By knowing more about these river deltas where millions of people live, planners can have a better idea of how best to balance these tradeoffs. Passalacqua and researchers in her lab have recently published research about these tradeoffs between the need for river freedom and humanity’s desire for stability.
    Passalacqua has been working on this topic for more than eight years. The team and collaborators are in the process of publishing another paper as part of this work that expands beyond the centerlines of rivers and will also look at riverbanks. That additional information will give an even clearer picture about river movement over time, with more nuance, because sides of the river can move in different directions and at different speeds.
    The research was funded through Passalacqua’s National Science Foundation CAREER award; grants from the NSF’s Ocean Sciences and Earth Sciences divisions; and Planet Texas 2050, a UT Austin initiative to support research to make communities more resilient. Co-authors on the paper are Jarriel and postdoctoral researcher John Swartz.
    Story Source:
    Materials provided by University of Texas at Austin. Note: Content may be edited for style and length.