More stories

  • Adding sound to quantum simulations

    When sound was first incorporated into movies in the 1920s, it opened up new possibilities for filmmakers, such as music and spoken dialogue. Physicists may be on the verge of a similar revolution, thanks to a new device developed at Stanford University that promises to bring an audio dimension to previously silent quantum science experiments.
    In particular, it could bring sound to a common quantum science setup known as an optical lattice, which uses a crisscrossing mesh of laser beams to arrange atoms in an orderly manner resembling a crystal. This tool is commonly used to study the fundamental characteristics of solids and other phases of matter that have repeating geometries. A shortcoming of these lattices, however, is that they are silent.
    “Without sound or vibration, we miss a crucial degree of freedom that exists in real materials,” said Benjamin Lev, associate professor of applied physics and of physics, who set his sights on this issue when he first came to Stanford in 2011. “It’s like making soup and forgetting the salt; it really takes the flavor out of the quantum ‘soup.’”
    After a decade of engineering and benchmarking, Lev and collaborators from Pennsylvania State University and the University of St. Andrews have produced the first optical lattice of atoms that incorporates sound. The research was published Nov. 11 in Nature. By designing a very precise cavity that held the lattice between two highly reflective mirrors, the researchers made it so the atoms could “see” themselves repeated thousands of times via particles of light, or photons, that bounce back and forth between the mirrors. This feedback causes the photons to behave like phonons — the building blocks of sound.
    “If it were possible to put your ear to the optical lattice of atoms, you would hear their vibration at around 1 kHz,” said Lev.
    A supersolid with sound
    Previous optical lattice experiments were silent affairs because they lacked the special elasticity of this new system. Lev, Sarang Gopalakrishnan — then a graduate student and now an assistant professor of physics at Penn State and co-author of the paper — and Paul Goldbart (now provost of Stony Brook University) came up with the foundational theory for this system. But it took collaboration with Jonathan Keeling — a reader at the University of St. Andrews and co-author of the paper — and years of work to build the corresponding device.

  • Nuclear radiation used to transmit digital data wirelessly

    Engineers have successfully transferred digitally encoded information wirelessly using nuclear radiation instead of conventional technology.
    Radio waves and mobile phone signals rely on electromagnetic radiation for communication, but in a new development, engineers from Lancaster University in the UK, working with the Jožef Stefan Institute in Slovenia, transferred digitally encoded information using “fast neutrons” instead.
    The researchers measured the spontaneous emission of fast neutrons from californium-252, a radioactive isotope produced in nuclear reactors.
    Modulated emissions were measured using a detector and recorded on a laptop.
    Several examples of information (a word, the alphabet and a random number selected blindly) were encoded serially into the modulation of the neutron field, and the output was decoded on a laptop, which recovered the encoded information on screen.
    A double-blind test was performed in which a number derived from a random number generator was encoded without prior knowledge of those uploading it, and then transmitted and decoded.
    All transmission tests attempted proved to be 100% successful.
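    As a rough illustration of the general idea, namely serially modulating an emission rate and recovering symbols from detector counts, the Python sketch below implements simple on-off keying over a noisy counting channel. The count rates, threshold and Poisson noise model are illustrative assumptions, not the encoding scheme or parameters reported by the Lancaster team.

```python
# Hypothetical sketch of on-off keying over a modulated neutron field.
# All rates and the noise model are assumptions for illustration only.
import numpy as np

RATE_ON, RATE_OFF = 120.0, 20.0   # assumed mean detector counts per slot (source exposed vs. background)

def encode(text: str) -> np.ndarray:
    """Turn ASCII text into a bit stream, one byte per character, MSB first."""
    bits = []
    for byte in text.encode("ascii"):
        bits.extend((byte >> k) & 1 for k in range(7, -1, -1))
    return np.array(bits)

def transmit(bits: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Simulate Poisson-distributed detector counts for each modulation slot."""
    rates = np.where(bits == 1, RATE_ON, RATE_OFF)
    return rng.poisson(rates)

def decode(counts: np.ndarray) -> str:
    """Threshold the counts back into bits and reassemble the bytes."""
    bits = (counts > (RATE_ON + RATE_OFF) / 2).astype(int)
    chars = [int("".join(str(b) for b in bits[i:i + 8]), 2) for i in range(0, len(bits), 8)]
    return bytes(chars).decode("ascii")

message = "NEUTRON"
counts = transmit(encode(message), np.random.default_rng(0))
print(decode(counts))   # with these well-separated rates the word is recovered reliably
```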
    Professor Malcolm Joyce of Lancaster University said: “We demonstrate the potential of fast neutron radiation as a medium for wireless communications for applications where conventional electromagnetic transmission is either not feasible or is inherently limited.”
    He said fast neutrons have an advantage over conventional electromagnetic waves, which are significantly weakened by transmission through materials including metals.
    “In some safety-critical scenarios, such as concerning the integrity of reactor containments, and metal vaults and bulkheads in maritime structures, it can be important to minimise the number of penetrations made through such metal structures for communications cabling. The use of neutrons for information transmission through such structures could negate the need for such penetrations and is perhaps also relevant to scenarios where limited transmissions are desirable in difficult circumstances, such as for emergency rescue operations.”
    Fast neutrons could also be incorporated into mixed-signal electronic systems to achieve signal mixing between electrons and neutrons. This could help meet the requirement to ensure the integrity of information transfer.
    Story Source:
    Materials provided by Lancaster University.

  • New computer model is a key step toward low-temperature preservation of 3D tissues, organs

    Medical science is a key step closer to the cryopreservation of brain slices used in neurological research, pancreatic cells for the treatment of diabetes and even whole organs, thanks to a new computer model that predicts how a tissue’s size will change during the preservation process.
    Findings of the study led by Adam Higgins of the Oregon State University College of Engineering were published in Biophysical Journal.
    “Cryopreservation of tissues would be useful for biomedical research and for transplantation medicine, but it’s difficult to cryopreserve tissues for various reasons,” said Higgins, associate professor of bioengineering. “A major reason is that formation of ice can break apart a tissue from the inside. Folks who cook are probably already familiar with this — a tomato that has been frozen and thawed becomes mushy.”
    Cryopreservation has long been widely used in comparatively simpler applications such as preserving semen, blood, embryos and plant seeds. A barrier to other uses has been damage from ice crystallization and the harmful nature of the compounds added to prevent ice formation.
    Vitrification, Higgins explains, is a cryopreservation strategy that thwarts ice crystal damage through chemicals known as cryoprotectants, or CPAs, that can keep ice from forming. An example of a CPA is ethylene glycol, used in automobile antifreeze.
    In tissues, a high enough concentration of CPAs causes a solid “glass” to form rather than ice crystals when tissue temperature is reduced to liquid nitrogen levels; liquid nitrogen boils at minus-320 degrees Fahrenheit (about minus-196 degrees Celsius).
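    The published model addresses CPA and water transport at the tissue scale. As a much simpler, hedged sketch of the kind of calculation involved, the code below integrates the classic two-parameter membrane transport model for a single cell suddenly exposed to a CPA solution; all parameter values are illustrative assumptions, not values from the paper.

```python
# Textbook-style two-parameter membrane transport model for one cell.
# This is NOT the tissue-level model of Higgins and colleagues; every
# parameter below is an assumed, order-of-magnitude placeholder.
import numpy as np

Lp   = 3.0e-14    # hydraulic conductivity, m/(Pa*s)      (assumed)
Ps   = 2.0e-8     # CPA membrane permeability, m/s        (assumed)
A    = 7.0e-10    # membrane surface area, m^2            (assumed)
R, T = 8.314, 295.0               # gas constant (J/mol/K), temperature (K)
Vw0  = 1.8e-15    # initial osmotically active water volume, m^3
Vb   = 0.4e-15    # osmotically inactive volume, m^3
iso  = 300.0      # isotonic osmolality, mol/m^3

def cell_volume(cpa_outside, dt=0.05, t_end=600.0):
    """Forward-Euler integration of water volume Vw and intracellular CPA moles Ns."""
    Vw, Ns = Vw0, 0.0
    volumes = []
    for _ in np.arange(0.0, t_end, dt):
        salt_in = iso * Vw0 / Vw            # impermeant solutes concentrate as water leaves
        cpa_in = Ns / Vw
        dVw = Lp * A * R * T * ((salt_in + cpa_in) - (iso + cpa_outside))   # osmotic water flux
        dNs = Ps * A * (cpa_outside - cpa_in)                               # CPA permeation
        Vw, Ns = Vw + dVw * dt, Ns + dNs * dt
        volumes.append(Vw + Vb)             # total cell volume
    return np.array(volumes)

v = cell_volume(cpa_outside=1500.0)         # step exposure to roughly 1.5 M CPA
print(v.min() / v[0], v[-1] / v[0])         # transient shrinkage, then recovery toward equilibrium
```

    A shrink-then-swell excursion of this kind, scaled up from single cells to whole tissues, is the sort of size change the new model is built to predict.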

  • Identifying individual proteins using nanopores and supercomputers

    The amount and types of proteins our cells produce tell us important details about our health and how our bodies work. But the methods we have of identifying and quantifying individual proteins are inadequate to the task. Not only is the diversity of proteins unknown, but often, amino acids are changed after synthesis through post-translational modifications.
    In recent years, much progress has been made in DNA reading using nanopores — minute pores in a membrane just large enough to let an unspooled DNA strand through, but only barely. By carefully measuring the ionic current through the nanopore as DNA passes through, biologists have been able to rapidly identify the order of bases in the sequence. In fact, this year, nanopores were used to finally sequence the entire human genome — something that was not previously possible with other technologies.
    In new research out in Science magazine, researchers from Delft University of Technology in the Netherlands and the University of Illinois at Urbana-Champaign (UIUC) in the U.S. have extended these DNA nanopore successes and provided a proof-of-concept that the same method is possible for single protein identification, characterizing proteins with single-amino-acid resolution and a vanishingly small margin of error (10^-6, or 1 in a million).
    “This nanopore peptide reader provides site-specific information about the peptide’s primary sequence that may find applications in single-molecule protein fingerprinting and variant identification,” the authors wrote.
    The workhorses of our cells, proteins are long peptide strings made of 20 different types of amino acids. The researchers utilized an enzyme called helicase Hel308 that can attach to DNA-peptide hybrids and pull them, in a controlled way, through a biological nanopore known as MspA (Mycobacterium smegmatis porin A). They chose the Hel308 DNA helicase because it can pull peptides through the pore in observable half-nucleotide steps, which correspond closely to single amino acids.
    Each step through the narrow gate theoretically produces a unique current signal as the amino acid partially blocks an electrical current carried by ions through the nanopore.
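    As a toy illustration of that read-out idea (not the authors’ analysis pipeline), the sketch below matches noisy per-step current levels against a small table of reference levels. The reference currents are invented placeholders rather than measured MspA blockade values, and real analyses use far richer signal features than a single mean level.

```python
# Nearest-level classification of per-step blockade currents.
# The reference values are hypothetical placeholders for illustration.
import numpy as np

reference = {"G": 52.0, "A": 49.5, "S": 47.0, "L": 43.0, "W": 36.0}   # assumed mean currents (pA)

def classify_steps(step_currents, ref=reference):
    """Assign each observed step current to the nearest reference residue."""
    names = list(ref)
    levels = np.array([ref[n] for n in names])
    return "".join(names[int(np.argmin(np.abs(levels - c)))] for c in step_currents)

rng = np.random.default_rng(1)
true_peptide = "GLASWGA"
observed = [reference[aa] + rng.normal(0.0, 0.3) for aa in true_peptide]   # noisy step levels
print(classify_steps(observed))   # recovers the true sequence at this noise level
```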

  • Data available for training AI to spot skin cancer are insufficient and lacking in pictures of darker skin

    The images and accompanying data available for training artificial intelligence (AI) to spot skin cancer are insufficient and include very few images of darker skin, according to research presented at the NCRI Festival and published in Lancet Digital Health. 
    AI is increasingly being used in medicine as it can make diagnosis of diseases like skin cancer quicker and more effective. However, AI needs to be ‘trained’ by looking at data and images from a large number of patients where the diagnosis has already been established, so an AI program depends heavily upon the information it is trained on.
    Researchers say there is an urgent need for better sets of data on skin cancers and other skin lesions which contain information on who is represented in the datasets.
    The research was presented by Dr David Wen from the University of Oxford, UK. He said: “AI programs hold a lot of potential for diagnosing skin cancer because they can look at pictures and quickly and cost-effectively evaluate any worrying spots on the skin. However, it’s important to know about the images and patients used to develop programs, as these influence which groups of people the programs will be most effective for in real-life settings. Research has shown that programs trained on images taken from people with lighter skin types only might not be as accurate for people with darker skin, and vice versa.”
    Dr Wen and his colleagues carried out the first ever review of all freely accessible sets of data on skin lesions around the world. They found 21 sets including more than 100,000 pictures.
    Diagnosis of skin cancer normally requires a photo of the worrying lesion as well as a picture taken with a special hand-held magnifier, called a dermatoscope, but only two of the 21 datasets included images taken with both of these methods. The datasets were also missing other important information, such as how images were chosen to be included, and evidence of ethical approval or patient consent.

  • Engineers design autonomous robot that can open doors, find wall outlet to recharge

    One flaw in the notion that robots will take over the world is that the world is full of doors.
    And doors are kryptonite to robots, said Ou Ma, an aerospace engineering professor at the University of Cincinnati.
    “Robots can do many things, but if you want one to open a door by itself and go through the doorway, that’s a tremendous challenge,” Ma said.
    Students in UC’s Intelligent Robotics and Autonomous Systems Laboratory have solved this complex problem in three-dimensional digital simulations. Now they’re building an autonomous robot that not only can open its own doors but also can find the nearest electric wall outlet to recharge without human assistance.
    This simple advance in independence represents a huge leap forward for helper robots that vacuum and disinfect office buildings, airports and hospitals. Helper robots are part of a $27 billion robotics industry, which includes manufacturing and automation.
    The study was published in the journal IEEE Access.
    UC College of Engineering and Applied Science doctoral student Yufeng Sun, the study’s lead author, said some researchers have addressed the problem by scanning an entire room to create a 3D digital model so the robot can locate a door. But that is a time-consuming custom solution that works only for the particular room that is scanned.

  • Global river database documents 40 years of change

    A first-ever database compiling movement of the largest rivers in the world over time could become a crucial tool for urban planners to better understand the deltas that are home to these rivers and a large portion of Earth’s population.
    The database, created by researchers at The University of Texas at Austin, uses publicly available remote sensing data to show how the river centerlines of the world’s 48 most threatened deltas have moved during the past 40 years. The data can be used to predict how rivers will continue to move over time and help governments manage population density and future development.
    “When we think about river management strategies, we have very little to no information about how rivers are moving over time,” said Paola Passalacqua, an associate professor in the Cockrell School of Engineering’s Department of Civil, Architectural and Environmental Engineering who leads the ongoing river analysis research.
    The research was published today in Proceedings of the National Academy of Sciences.
    The database includes three U.S. rivers: the Mississippi, the Colorado and the Rio Grande. Although some areas of these deltas are experiencing migration, overall, they are mostly stable, the data show. Aggressive containment strategies to keep those rivers in their place, especially near population centers, play a role in that, Passalacqua said.
    Average migration rates for each river delta help identify which areas are stable and which are experiencing major river shifts. The researchers also published more extensive data online that includes information about how different segments of rivers have moved over time. It could help planners see what’s going on in rural areas vs. urban areas when making decisions about how to manage the rivers and what to do with development.
    The researchers leaned on techniques from a variety of disciplines to compile the data and published their methods online. Machine learning and image processing software helped them examine decades’ worth of images. The researchers worked with Alan Bovik of the Department of Electrical and Computer Engineering and doctoral student Leo Isikdogan to develop that technology. They also borrowed from fluid mechanics, using tools designed to monitor water particles in turbulence experiments to instead track changes to river locations over the years.
    “We got the idea to use tools from fluid mechanics while attending a weekly department seminar where other researchers at the university share their work,” said Tess Jarriel, a graduate research assistant in Passalacqua’s lab and lead author of the paper. “It just goes to show how important it is to collaborate across disciplines.”
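    As a simplified sketch of the underlying bookkeeping (how far each point of a digitized river centerline has moved per year), the code below compares two centerlines by nearest-point distances. The published workflow relies on the machine learning, image processing and particle-tracking tools described above rather than this toy calculation, and the coordinates here are made up.

```python
# Toy centerline-migration calculation; the real analysis uses remote-sensing
# imagery and PIV-style tracking, not this nearest-point shortcut.
import numpy as np

def migration_rate(centerline_t0, centerline_t1, years):
    """Mean distance per year from each old centerline point to the new centerline.

    Each centerline is an (N, 2) array of map coordinates in meters."""
    a = np.asarray(centerline_t0, dtype=float)
    b = np.asarray(centerline_t1, dtype=float)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)   # all pairwise distances
    nearest = d.min(axis=1)                                      # displacement of each old point
    return nearest.mean() / years

# made-up example: a straight reach that shifted about 30 m sideways over 40 years
old = np.column_stack([np.linspace(0, 1000, 101), np.zeros(101)])
new = old + np.array([0.0, 30.0])
print(migration_rate(old, new, 40.0))   # ~0.75 m/yr
```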
    Rivers with high sediment flux and flood frequency naturally move more; that mobility is part of an important tradeoff that underpins Passalacqua’s research.
    By knowing more about these river deltas where millions of people live, planners can have a better idea of how best to balance these tradeoffs. Passalacqua and researchers in her lab have recently published research about these tradeoffs between the need for river freedom and humanity’s desire for stability.
    Passalacqua has been working on this topic for more than eight years. The team and collaborators are in the process of publishing another paper as part of this work that expands beyond the centerlines of rivers and will also look at riverbanks. That additional information will give an even clearer picture about river movement over time, with more nuance, because sides of the river can move in different directions and at different speeds.
    The research was funded through Passalacqua’s National Science Foundation CAREER award; grants from the NSF’s Ocean Sciences and Earth Sciences divisions; and Planet Texas 2050, a UT Austin initiative to support research to make communities more resilient. Co-authors on the paper are Jarriel and postdoctoral researcher John Swartz.
    Story Source:
    Materials provided by University of Texas at Austin.

  • Hidden behavior of supercapacitor materials

    Researchers from the University of Surrey’s Advanced Technology Institute (ATI) and the University of São Paulo have developed a new analysis technique that will help scientists improve renewable energy storage by making better supercapacitors. The team’s new approach enables researchers to investigate the complex inter-connected behaviour of supercapacitor electrodes made from layers of different materials.
    Improvements in energy storage are vital if countries are to deliver carbon reduction targets. The inherent unpredictability of energy from solar and wind means effective storage is required to ensure consistency in supply, and supercapacitors are seen as an important part of the solution.
    Supercapacitors could also be the answer to charging electric vehicles much faster than is possible using lithium-ion batteries. However, more supercapacitor development is needed to enable them to effectively store enough electricity.
    Surrey’s peer-reviewed paper, published in Electrochimica Acta, explains how the research team used a cheap polymer material called Polyaniline (PANI), which stores energy through a mechanism known as pseudocapacitance. PANI is conductive and can be used as the electrode in a supercapacitor device, storing charge by trapping ions. To maximise energy storage, the researchers have developed a novel method of depositing a thin layer of PANI onto a forest of conductive carbon nanotubes. This composite material makes an excellent supercapacitive electrode, but the fact that it is made up of different materials makes it difficult to separate and fully understand the complex processes which occur during charging and discharging. This is a problem across the field of pseudocapacitor development.
    To tackle this problem, the researchers adopted a technique known as the Distribution of Relaxation Times. This analysis method allows scientists to examine complex electrode processes, separating and identifying them so that fabrication methods can be optimised to maximise useful reactions and reduce reactions that damage the electrode. The technique can also be applied by researchers using different materials in supercapacitor and pseudocapacitor development.
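    For readers curious what such an analysis looks like in practice, the sketch below shows a bare-bones Distribution of Relaxation Times fit: an impedance spectrum is modelled as a sum of ideal RC relaxations on a fixed grid of time constants, with the weights recovered by ridge-regularised least squares. This is a generic, textbook-style formulation with synthetic data, not the Surrey/São Paulo implementation; practical DRT tools add non-negativity constraints and more careful regularisation.

```python
# Minimal DRT fit on synthetic impedance data; illustrative only.
import numpy as np

def drt_fit(freq, Z, tau_grid, lam=1e-3):
    """Recover relaxation strengths gamma(tau) from an impedance spectrum Z(omega)."""
    omega = 2.0 * np.pi * np.asarray(freq)
    # kernel of ideal RC relaxations: each column is 1 / (1 + j*omega*tau_k)
    K = 1.0 / (1.0 + 1j * omega[:, None] * tau_grid[None, :])
    A = np.vstack([K.real, K.imag])            # stack real and imaginary parts
    b = np.concatenate([Z.real, Z.imag])
    # ridge-regularised normal equations
    return np.linalg.solve(A.T @ A + lam * np.eye(len(tau_grid)), A.T @ b)

# synthetic electrode with two relaxation processes (tau = 1 ms and 0.1 s)
freq = np.logspace(-1, 4, 60)
Z = 5.0 / (1 + 2j * np.pi * freq * 1e-3) + 2.0 / (1 + 2j * np.pi * freq * 1e-1)
tau_grid = np.logspace(-5, 1, 80)
gamma = drt_fit(freq, Z, tau_grid)
print(tau_grid[np.argmax(gamma)])              # expected to fall near one of the true time constants
```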
    Ash Stott, a postgraduate research student at the University of Surrey who was the lead scientist on the project, said:
    “The future of global energy use will depend on consumers and industry generating, storing and using energy more efficiently, and supercapacitors will be one of the leading technologies for intermittent storage, energy harvesting and high-power delivery. Our work will help make that happen more effectively.”
    Professor Ravi Silva, Director of the ATI and principal author, said:
    “Following on from world leaders pledging their support for green energy at COP26, our work shows researchers how to accelerate the development of high-performance materials for use as energy storage elements, a key component of solar or wind energy systems. This research brings us one step closer to a clean, cost-effective energy future.”
    Story Source:
    Materials provided by University of Surrey.