More stories

    Software spots and fixes hang bugs in seconds, rather than weeks

    Hang bugs — when software gets stuck, but doesn’t crash — can frustrate both users and programmers, taking weeks for companies to identify and fix. Now researchers from North Carolina State University have developed software that can spot and fix the problems in seconds.
    “Many of us have experience with hang bugs — think of a time when you were on a website and the wheel just kept spinning and spinning,” says Helen Gu, co-author of a paper on the work and a professor of computer science at NC State. “Because these bugs don’t crash the program, they’re hard to detect. But they can frustrate or drive away customers and hurt a company’s bottom line.”
    With that in mind, Gu and her collaborators developed an automated program, called HangFix, that can detect hang bugs, diagnose the relevant problem, and apply a patch that corrects the root cause of the error.
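    HangFix itself targets production cloud systems and the article includes no code; purely as a hypothetical illustration of the class of non-crashing defect involved, consider a request loop that spins forever when a server accepts a connection but never replies, and the kind of patch that bounds it with timeouts and a retry limit (the function and parameters below are invented for this sketch):

    ```python
    import socket

    def fetch_status(host: str, port: int, retries: int = 5, timeout: float = 2.0) -> bytes:
        """Illustrative only: a bounded, timeout-protected version of a request loop.

        The buggy variant of this pattern loops forever when the peer accepts the
        connection but never sends data (an unbounded `while True` around a recv()
        with no timeout), producing a hang rather than a crash.
        """
        for attempt in range(retries):
            try:
                with socket.create_connection((host, port), timeout=timeout) as conn:
                    conn.settimeout(timeout)          # bound every blocking read
                    data = conn.recv(4096)
                    if data:                          # an empty read means the peer closed
                        return data
            except OSError:
                pass                                  # transient failure: fall through and retry
        raise TimeoutError(f"no response from {host}:{port} after {retries} attempts")
    ```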
    The researchers tested a prototype of HangFix against 42 real-world hang bugs in 10 commonly used cloud server applications. The bugs were drawn from a database of hang bugs that programmers discovered affecting various websites. HangFix fixed 40 of the bugs in seconds.
    “The remaining two bugs were identified and partially fixed, but required additional input from programmers who had relevant domain knowledge of the application,” Gu says.
    For comparison, it took weeks or months to detect, diagnose and fix those hang bugs when they were first discovered.
    “We’re optimistic that this tool will make hang bugs less common — and websites less frustrating for many users,” Gu says. “We are working to integrate HangFix into InsightFinder.” InsightFinder is the AI-based IT operations and analytics startup founded by Gu.
    The paper, “HangFix: Automatically Fixing Software Hang Bugs for Production Cloud Systems,” is being presented at the ACM Symposium on Cloud Computing (SoCC’20), being held online Oct. 19-21. The paper was co-authored by Jingzhu He, a Ph.D. student at NC State who is nearing graduation; Ting Dai, a Ph.D. graduate of NC State who is now at IBM Research; and Guoliang Jin, an assistant professor of computer science at NC State.
    The work was done with support from the National Science Foundation under grants 1513942 and 1149445.
    HangFix is the latest in a long line of tools Gu’s team has developed to address cloud computing challenges. Her 2011 paper, “CloudScale: Elastic Resource Scaling for Multi-tenant Cloud Systems,” was selected as the winner of the 2020 SoCC 10-Year Award at this year’s conference.

    Story Source:
    Materials provided by North Carolina State University. Note: Content may be edited for style and length.

    Using robotic assistance to make colonoscopy kinder and easier

    Scientists have made a breakthrough in their work to develop semi-autonomous colonoscopy, using a robot to guide a medical device into the body.
    The milestone brings closer the prospect of an intelligent robotic system being able to guide instruments to precise locations in the body to take biopsies or allow internal tissues to be examined.
    A doctor or nurse would still be on hand to make clinical decisions but the demanding task of manipulating the device is offloaded to a robotic system.
    The latest findings — ‘Enabling the future of colonoscopy with intelligent and autonomous magnetic manipulation’ — are the culmination of 12 years of research by an international team of scientists led by the University of Leeds.
    The research is published today (Monday, 12 October) in the scientific journal Nature Machine Intelligence. 
    Patient trials using the system could begin next year or in early 2022.

    Pietro Valdastri, Professor of Robotics and Autonomous Systems at Leeds, is supervising the research. He said: “Colonoscopy gives doctors a window into the world hidden deep inside the human body and it plays a vital role in the screening of diseases such as colorectal cancer. But the technology has remained relatively unchanged for decades.
    “What we have developed is a system that is easier for doctors or nurses to operate and is less painful for patients. It marks an important step in the move to make colonoscopy much more widely available — essential if colorectal cancer is to be identified early.”
    Because the system is easier to use, the scientists hope this can increase the number of providers who can perform the procedure and allow for greater patient access to colonoscopy.
    A colonoscopy is a procedure to examine the rectum and colon. Conventional colonoscopy is carried out using a semi-flexible tube which is inserted into the anus, a process some patients find so painful they require an anaesthetic.
    Magnetic flexible colonoscope
    The research team has developed a smaller, capsule-shaped device which is tethered to a narrow cable and is inserted into the anus and then guided into place — not by the doctor or nurse pushing the colonoscope but by a magnet on a robotic arm positioned over the patient.

    The robotic arm moves around the patient as it manoeuvres the capsule. The system is based on the principle that magnetic forces attract and repel.
    The magnet on the outside of the patient interacts with tiny magnets in the capsule inside the body, navigating it through the colon. The researchers say it will be less painful than having a conventional colonoscopy.
    Guiding the robotic arm can be done manually but it is a technique that is difficult to master. In response, the researchers have developed different levels of robotic assistance. This latest research evaluated how effective the different levels of robotic assistance were in aiding non-specialist staff to carry out the procedure.
    Levels of robotic assistance
    Direct robot control. This is where the operator has direct control of the robot via a joystick. In this case, there is no assistance.
    Intelligent endoscope teleoperation. The operator focuses on where they want the capsule to be located in the colon, leaving the robotic system to calculate the movements of the robotic arm necessary to get the capsule into place (a rough control-loop sketch of this mode follows the list).
    Semi-autonomous navigation. The robotic system autonomously navigates the capsule through the colon, using computer vision — although this can be overridden by the operator.
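    As a rough, hypothetical sketch (not the Leeds team’s actual controller), intelligent endoscope teleoperation can be thought of as a closed loop: the system tracks the capsule, compares its position with the operator’s chosen target, and moves the external magnet to shrink the error. The function, gains and standoff distance below are illustrative assumptions only, with no real magnetic model.

    ```python
    import numpy as np

    def teleoperation_step(capsule_pos, target_pos, magnet_pos, gain=0.5, standoff=0.12):
        """One hypothetical control step: move the external magnet so the capsule
        is drawn toward the operator's chosen target.

        All quantities are 3D positions in metres; 'standoff' keeps the magnet a
        fixed height above the abdomen rather than modelling real magnetics.
        """
        error = target_pos - capsule_pos               # where the operator wants the capsule to go
        desired_magnet = capsule_pos + gain * error    # lead the capsule in that direction
        desired_magnet[2] = capsule_pos[2] + standoff  # stay above the patient
        return magnet_pos + 0.2 * (desired_magnet - magnet_pos)  # smooth the arm motion

    # Example: the operator picks a target; each control cycle nudges the arm.
    capsule = np.array([0.0, 0.0, 0.0])
    target = np.array([0.05, 0.02, 0.0])
    magnet = np.array([0.0, 0.0, 0.12])
    magnet = teleoperation_step(capsule, target, magnet)
    ```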
    During a laboratory simulation, 10 non-expert staff were asked to get the capsule to a point within the colon within 20 minutes. They did that five times, using the three different levels of assistance.
    Using direct robot control, the participants had a 58% success rate. That increased to 96% using intelligent endoscope teleoperation — and 100% using semi-autonomous navigation.
    In the next stage of the experiment, two participants were asked to navigate a conventional colonoscope into the colon of two anaesthetised pigs — and then to repeat the task with the magnet-controlled robotic system using the different levels of assistance. A vet was in attendance to ensure the animals were not harmed.
    The participants were scored on the NASA Task Load Index, a measure of how taxing a task was, both physically and mentally.
    The NASA Task Load Index revealed that they found it easier to operate the colonoscope with robotic assistance. Frustration was a major factor both when operating the conventional colonoscope and when participants had direct control of the robot.
    James Martin, a PhD researcher from the University of Leeds who co-led the study, said: “Operating the robotic arm is challenging. It is not very intuitive and that has put a brake on the development of magnetic flexible colonoscopes.
    “But we have demonstrated for the first time that it is possible to offload that function to the robotic system, leaving the operator to think about the clinical task they are undertaking — and it is making a measurable difference in human performance.”
    The techniques developed to conduct colonoscopy examinations could be applied to other endoscopic devices, such as those used to inspect the upper digestive tract or lungs.
    Dr Bruno Scaglioni, a Postdoctoral Research Fellow at Leeds and co-leader of the study, added: “Robot-assisted colonoscopy has the potential to revolutionize the way the procedure is carried out. It means people conducting the examination do not need to be experts in manipulating the device.
    “That will hopefully make the technique more widely available, where it could be offered in clinics and health centres rather than hospitals.”

    Liquid metals come to the rescue of semiconductors

    Moore’s law is the empirical observation that the number of transistors in integrated circuits (ICs) doubles roughly every two years. However, Moore’s law has started to fail, as transistors are now so small that current silicon-based technologies offer little room for further shrinking.
    One possible way around the end of Moore’s law is to turn to two-dimensional semiconductors. These materials are so thin that the free charge carriers — the electrons and holes that carry information in transistors — are confined to an ultra-thin plane. This confinement can potentially make the semiconductor very easy to switch. It also gives the charge carriers directional pathways along which they can move without scattering, leading to vanishingly small transistor resistance. In theory, two-dimensional materials could therefore yield transistors that waste no energy during on/off switching: they could switch very fast and leak essentially no current in their off state. Sounds ideal, but life is not ideal! In reality, many technological barriers still have to be overcome to create such perfect ultra-thin semiconductors. One barrier with current technologies is that the deposited ultra-thin films are full of grain boundaries, which scatter the charge carriers and increase resistive losses.
    One of the most exciting ultra-thin semiconductors is molybdenum disulphide (MoS2), which has been investigated for its electronic properties for the past two decades. However, obtaining large-area two-dimensional MoS2 without any grain boundaries has proven to be a real challenge. With current large-scale deposition technologies, grain-boundary-free MoS2 — essential for making ICs — has not yet been achieved with acceptable maturity. Now, however, researchers at the School of Chemical Engineering, University of New South Wales (UNSW) have developed a deposition approach that eliminates such grain boundaries.
    “This unique capability was achieved with the help of gallium metal in its liquid state. Gallium is an amazing metal with a low melting point of only 29.8 °C. It means that at a normal office temperature it is solid, while it turns into a liquid when placed in the palm of someone’s hand. It is a melted metal, so its surface is atomically smooth. It is also a conventional metal, which means that its surface provides a large number of free electrons for facilitating chemical reactions,” said Ms Yifang Wang, the first author of the paper.
    “By bringing the sources of molybdenum and sulphur near the surface of gallium liquid metal, we were able to realize chemical reactions that form the molybdenum-sulphur bonds to establish the desired MoS2. The formed two-dimensional material is templated onto an atomically smooth surface of gallium, so it is naturally nucleated and grain boundary free. This means that, with a second annealing step, we were able to obtain very large area MoS2 with no grain boundaries. This is a very important step for scaling up this fascinating ultra-smooth semiconductor,” said Prof Kourosh Kalantar-Zadeh, the lead author of the work.
    The UNSW researchers now plan to extend their method to other two-dimensional semiconductors and dielectric materials, in order to create a suite of materials that can be used as different parts of transistors.

    Story Source:
    Materials provided by ARC Centre of Excellence in Future Low-Energy Electronics Technologies. Note: Content may be edited for style and length.

    New virtual reality software allows scientists to 'walk' inside cells

    Virtual reality software which allows researchers to ‘walk’ inside and analyse individual cells could be used to understand fundamental problems in biology and develop new treatments for disease.
    The software, called vLUME, was created by scientists at the University of Cambridge and 3D image analysis software company Lume VR Ltd. It allows super-resolution microscopy data to be visualised and analysed in virtual reality, and can be used to study everything from individual proteins to entire cells. Details are published in the journal Nature Methods.
    Super-resolution microscopy, which was awarded the Nobel Prize for Chemistry in 2014, makes it possible to obtain images at the nanoscale by using clever tricks of physics to get around the limits imposed by light diffraction. This has allowed researchers to observe molecular processes as they happen. However, a problem has been the lack of ways to visualise and analyse this data in three dimensions.
    “Biology occurs in 3D, but up until now it has been difficult to interact with the data on a 2D computer screen in an intuitive and immersive way,” said Dr Steven F. Lee from Cambridge’s Department of Chemistry, who led the research. “It wasn’t until we started seeing our data in virtual reality that everything clicked into place.”
    The vLUME project started when Lee and his group met with the Lume VR founders at a public engagement event at the Science Museum in London. While Lee’s group had expertise in super-resolution microscopy, the team from Lume specialised in spatial computing and data analysis, and together they were able to develop vLUME into a powerful new tool for exploring complex datasets in virtual reality.
    “vLUME is revolutionary imaging software that brings humans into the nanoscale,” said Alexandre Kitching, CEO of Lume. “It allows scientists to visualise, question and interact with 3D biological data, in real time all within a virtual reality environment, to find answers to biological questions faster. It’s a new tool for new discoveries.”
    Viewing data in this way can stimulate new initiatives and ideas. For example, Anoushka Handa — a PhD student from Lee’s group — used the software to image an immune cell taken from her own blood, and then stood inside her own cell in virtual reality. “It’s incredible — it gives you an entirely different perspective on your work,” she said.
    The software allows multiple datasets with millions of data points to be loaded in and finds patterns in the complex data using in-built clustering algorithms. These findings can then be shared with collaborators worldwide using image and video features in the software.
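    The article does not name the clustering algorithms built into vLUME; purely as a hypothetical sketch of how 3D localisation data of this kind is commonly grouped, a density-based method such as DBSCAN can separate millions of (x, y, z) points into structures and background noise. The synthetic data below is an invented stand-in for real super-resolution localisations.

    ```python
    import numpy as np
    from sklearn.cluster import DBSCAN

    # Hypothetical stand-in for single-molecule localisations (coordinates in nanometres):
    # two tight clusters plus scattered background noise.
    rng = np.random.default_rng(0)
    points = np.vstack([
        rng.normal(loc=(0, 0, 0), scale=20, size=(500, 3)),       # first structure
        rng.normal(loc=(300, 300, 100), scale=20, size=(500, 3)),  # second structure
        rng.uniform(-200, 500, size=(100, 3)),                     # background noise
    ])

    labels = DBSCAN(eps=40, min_samples=10).fit_predict(points)    # label -1 marks noise
    print(f"found {labels.max() + 1} clusters, {np.sum(labels == -1)} noise points")
    ```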
    “Data generated from super-resolution microscopy is extremely complex,” said Kitching. “For scientists, running analysis on this data can be very time consuming. With vLUME, we have managed to vastly reduce that wait time allowing for more rapid testing and analysis.”
    The team are mostly using vLUME with biological datasets, such as neurons, immune cells or cancer cells. For example, Lee’s group has been studying how antigen cells trigger an immune response in the body. “Through segmenting and viewing the data in vLUME, we’ve quickly been able to rule out certain hypotheses and propose new ones,” said Lee. “This software allows researchers to explore, analyse, segment and share their data in new ways. All you need is a VR headset.”

    Story Source:
    Materials provided by University of Cambridge. The original story is licensed under a Creative Commons License. Note: Content may be edited for style and length.

    Multi-state data storage leaving binary behind

    Electronic data is being produced at a breath-taking rate.
    The total amount of data stored in data centres around the globe is of the order of ten zettabytes (a zettabyte is a trillion gigabytes), and we estimate that amount doubles every couple of years.
    With 8% of global electricity already being consumed in information and communication technology (ICT), low-energy data-storage is a key priority.
    To date there is no clear winner in the race for next-generation memory that is non-volatile, has great endurance, is highly energy efficient, low cost, high density, and allows fast access.
    The joint international team comprehensively reviews ‘multi-state memory’ data storage, which steps ‘beyond binary’ to store more data than just 0s and 1s.
    MULTI-STATE MEMORY: MORE THAN JUST ZEROES AND ONES
    Multi-state memory is an extremely promising technology for future data storage, with the ability to store more than a single bit (i.e., 0 or 1) per cell, allowing much higher storage density (the amount of data stored per unit area).

    This circumvents the plateauing of benefits historically offered by ‘Moore’s Law’, where component size halved about every two years. In recent years, the long-predicted plateauing of Moore’s Law has been observed, with charge leakage and spiralling research and fabrication costs putting the nail in the Moore’s Law coffin.
    Non-volatile, multi-state memory (NMSM) offers energy efficiency, nonvolatility, fast access, and low cost.
    Storage density is dramatically enhanced without scaling down the dimensions of the memory cell, making memory devices more efficient and less expensive.
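    As a back-of-the-envelope illustration (ours, not the review’s), the density gain comes from the number of distinguishable states a cell can hold: a cell with N stable states stores log2(N) bits, so a 4-state cell doubles, and a 16-state cell quadruples, the data held in the same footprint.

    ```python
    import math

    # Bits stored per memory cell as a function of the number of distinguishable states.
    for states in (2, 4, 8, 16):
        print(f"{states:>2} states per cell -> {math.log2(states):.0f} bits per cell")
    ```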
    NEUROMORPHIC COMPUTER MIMICKING THE HUMAN BRAIN
    Multi-state memory also enables neuromorphic computing, a proposed future technology that would mirror the structure of the human brain. This radically different, brain-inspired computing regime could potentially provide the economic impetus for adoption of a novel technology such as NMSM.
    NMSMs allow analog calculation, which could be vital to intelligent, neuromorphic networks, as well as potentially helping us finally unravel the working mechanism of the human brain itself.
    THE STUDY
    The paper reviews device architectures, working mechanisms, material innovation, challenges, and recent progress for leading NMSM candidates, including:
    Flash memory
    magnetic random-access memory (MRAM)
    resistive random-access memory (RRAM)
    ferroelectric random-access memory (FeRAM)
    phase-change memory (PCM)

    New project to build nano-thermometers could revolutionize temperature imaging

    Cheaper refrigerators? Stronger hip implants? A better understanding of human disease? All of these could be possible and more, someday, thanks to an ambitious new project underway at the National Institute of Standards and Technology (NIST).
    NIST researchers are in the early stages of a massive undertaking to design and build a fleet of tiny ultra-sensitive thermometers. If they succeed, their system will be the first to make real-time measurements of temperature on the microscopic scale in an opaque 3D volume — which could include medical implants, refrigerators, and even the human body.
    The project is called Thermal Magnetic Imaging and Control (Thermal MagIC), and the researchers say it could revolutionize temperature measurements in many fields: biology, medicine, chemical synthesis, refrigeration, the automotive industry, plastic production — “pretty much anywhere temperature plays a critical role,” said NIST physicist Cindi Dennis. “And that’s everywhere.”
    The NIST team has now finished building its customized laboratory spaces for this unique project and has begun the first major phase of the experiment.
    Thermal MagIC will work by using nanometer-sized objects whose magnetic signals change with temperature. The objects would be incorporated into the liquids or solids being studied — the melted plastic that might be used as part of an artificial joint replacement, or the liquid coolant being recirculated through a refrigerator. A remote sensing system would then pick up these magnetic signals, meaning the system being studied would be free from wires or other bulky external objects.
    The final product could make temperature measurements that are 10 times more precise than state-of-the-art techniques, acquired in one-tenth the time in a volume 10,000 times smaller. This equates to measurements accurate to within 25 millikelvin (thousandths of a kelvin) in as little as a tenth of a second, in a volume just a hundred micrometers (millionths of a meter) on a side. The measurements would be “traceable” to the International System of Units (SI); in other words, its readings could be accurately related to the fundamental definition of the kelvin, the world’s basic unit of temperature.

    The system aims to measure temperatures over the range from 200 to 400 kelvin (K), which is about -99 to 260 degrees Fahrenheit (F). This would cover most potential applications — at least the ones the Thermal MagIC team envisions will be possible within the next 5 years. Dennis and her colleagues see potential for a much larger temperature range, stretching from 4 K to 600 K, which would encompass everything from supercooled superconductors to molten lead. But that is not a part of current development plans.
    “This is a big enough sea change that we expect that if we can develop it — and we have confidence that we can — other people will take it and really run with it and do things that we currently can’t imagine,” Dennis said.
    Potential applications are mostly in research and development, but Dennis said the increase in knowledge would likely trickle down to a variety of products, possibly including 3D printers, refrigerators, and medicines.
    What Is It Good For?
    Whether it’s the thermostat in your living room or a high-precision standard instrument that scientists use for laboratory measurements, most thermometers used today can only measure relatively big areas — on a macroscopic as opposed to microscopic level. These conventional thermometers are also intrusive, requiring sensors to penetrate the system being measured and to connect to a readout system by bulky wires.

    Infrared thermometers, such as the forehead instruments used at many doctors’ offices, are less intrusive. But they still only make macroscopic measurements and cannot see beneath surfaces.
    Thermal MagIC should let scientists get around both these limitations, Dennis said.
    Engineers could use Thermal MagIC to study, for the first time, how heat transfer occurs within different coolants on the microscale, which could aid their quest to find cheaper, less energy-intensive refrigeration systems.
    Doctors could use Thermal MagIC to study diseases, many of which are associated with temperature increases — a hallmark of inflammation — in specific parts of the body.
    And manufacturers could use the system to better control 3D printing machines that melt plastic to build custom objects such as medical implants and prostheses. Without the ability to measure temperature on the microscale, 3D printing developers are missing crucial information about what’s going on inside the plastic as it solidifies into an object. More knowledge could improve the strength and quality of 3D-printed materials someday, by giving engineers more control over the 3D printing process.
    Giving It OOMMF
    The first step in making this new thermometry system is creating nano-sized magnets that will give off strong magnetic signals in response to temperature changes. To keep particle concentrations as low as possible, the magnets will need to be 10 times more sensitive to temperature changes than any objects that currently exist.
    To get that kind of signal, Dennis said, researchers will likely need to use multiple magnetic materials in each nano-object. A core of one substance will be surrounded by other materials like the layers of an onion.
    The trouble is that there are practically endless combinations of properties that can be tweaked, including the materials’ composition, size, shape, the number and thickness of the layers, or even the number of materials. Going through all of these potential combinations and testing each one for its effect on the object’s temperature sensitivity could take multiple lifetimes to accomplish.
    To help them get there in months instead of decades, the team is turning to sophisticated software: the Object Oriented MicroMagnetic Framework (OOMMF), a widely used modeling program developed by NIST researchers Mike Donahue and Don Porter.
    The Thermal MagIC team will use this program to create a feedback loop. NIST chemists Thomas Moffat, Angela Hight Walker and Adam Biacchi will synthesize new nano-objects. Then Dennis and her team will characterize the objects’ properties. And finally, Donahue will help them feed that information into OOMMF, which will make predictions about what combinations of materials they should try next.
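    The article describes this loop only at a high level; the sketch below is a heavily simplified, hypothetical rendering of it, with `simulate_sensitivity` standing in for a real micromagnetic prediction (such as one produced with OOMMF) and the candidate designs standing in for synthesised nano-objects. None of the numbers are real.

    ```python
    def simulate_sensitivity(design: dict) -> float:
        """Hypothetical placeholder: score how strongly a layered nanoparticle's
        magnetic signal would vary with temperature (higher is better)."""
        # Toy model only: reward a thin shell around a mid-sized core.
        return design["core_nm"] * 0.1 - abs(design["shell_nm"] - 2.0)

    # Candidate core/shell geometries, standing in for proposed nano-objects.
    candidates = [{"core_nm": c, "shell_nm": s} for c in (10, 15, 20) for s in (1.0, 2.0, 4.0)]

    best = None
    for design in candidates:                    # stand-in for the propose/characterise loop
        score = simulate_sensitivity(design)     # stand-in for a micromagnetic prediction
        if best is None or score > best[0]:
            best = (score, design)

    print("most promising design in this toy search:", best[1])
    ```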
    “We have some very promising results from the magnetic nano-objects side of things, but we’re not quite there yet,” Dennis said.
    Each Dog Is a Voxel
    So how do they measure the signals given out by tiny concentrations of nano-thermometers inside a 3D object in response to temperature changes? They do it with a machine called a magnetic particle imager (MPI), which surrounds the sample and measures a magnetic signal coming off the nanoparticles.
    Effectively, they measure changes to the magnetic signal coming off one small volume of the sample, called a “voxel” — basically a 3D pixel — and then scan through the entire sample one voxel at a time.
    But it’s hard to focus a magnetic field, said NIST physicist Solomon Woods. So they achieve their goal in reverse.
    Consider a metaphor. Say you have a dog kennel, and you want to measure how loud each individual dog is barking. But you only have one microphone. If multiple dogs are barking at once, your mic will pick up all of that sound, but with only one mic you won’t be able to distinguish one dog’s bark from another’s.
    However, if you could quiet each dog somehow — perhaps by occupying its mouth with a bone — except for a single cocker spaniel in the corner, then your mic would still be picking up all the sounds in the room, but the only sound would be from the cocker spaniel.
    In theory, you could do this with each dog in sequence — first the cocker spaniel, then the mastiff next to it, then the labradoodle next in line — each time leaving just one dog bone-free.
    In this metaphor, each dog is a voxel.
    Basically, the researchers max out the ability of all but one small volume of their sample to respond to a magnetic field. (This is the equivalent of stuffing each dog’s mouth with a delicious bone.) Then, measuring the change in magnetic signal from the entire sample effectively lets you measure just that one little section.
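    Purely as a conceptual sketch (ours, not NIST’s instrument code), the voxel-by-voxel idea can be written as a loop: pin the response everywhere except one small region, record the total signal, and attribute the change to that region before moving on. The arrays and the flat “saturated” level below are invented for illustration.

    ```python
    import numpy as np

    # Hypothetical per-voxel signals: what each small region would contribute if it
    # were the only one free to respond (this is what we want to recover).
    rng = np.random.default_rng(1)
    true_signal = rng.uniform(0.5, 1.5, size=(4, 4, 4))
    saturated_level = 1.0            # contribution of a voxel that has been "maxed out"

    recovered = np.zeros_like(true_signal)
    for idx in np.ndindex(true_signal.shape):
        # Total pickup = every other voxel pinned at its saturated level + this voxel free.
        total = saturated_level * (true_signal.size - 1) + true_signal[idx]
        # Subtracting the known saturated background isolates the one free voxel.
        recovered[idx] = total - saturated_level * (true_signal.size - 1)

    assert np.allclose(recovered, true_signal)   # each voxel is read out in turn
    ```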
    MPI systems similar to this exist but are not sensitive enough to measure the kind of tiny magnetic signal that would come from a small change in temperature. The challenge for the NIST team is to boost the signal significantly.
    “Our instrumentation is very similar to MPI, but since we have to measure temperature, not just measure the presence of a nano-object, we essentially need to boost our signal-to-noise ratio over MPI by a thousand or 10,000 times,” Woods said.
    They plan to boost the signal using state-of-the-art technologies. For example, Woods may use superconducting quantum interference devices (SQUIDs), cryogenic sensors that measure extremely subtle changes in magnetic fields, or atomic magnetometers, which detect how energy levels of atoms are changed by an external magnetic field. Woods is working on which are best to use and how to integrate them into the detection system.
    The final part of the project is making sure the measurements are traceable to the SI, a project led by NIST physicist Wes Tew. That will involve measuring the nano-thermometers’ magnetic signals at different temperatures that are simultaneously being measured by standard instruments.
    Other key NIST team members include Thinh Bui, Eric Rus, Brianna Bosch Correa, Mark Henn, Eduardo Correa and Klaus Quelhas.
    Before finishing their new laboratory space, the researchers were able to complete some important work. In a paper published last month in the International Journal on Magnetic Particle Imaging, the group reported that they had found and tested a “promising” nanoparticle material made of iron and cobalt, with temperature sensitivities that varied in a controllable way depending on how the team prepared the material. Adding an appropriate shell material to encase this nanoparticle “core” would bring the team closer to creating a working temperature-sensitive nanoparticle for Thermal MagIC.
    In the past few weeks, the researchers have made further progress testing combinations of materials for the nanoparticles.
    “Despite the challenge of working during the pandemic, we have had some successes in our new labs,” Woods said. “These achievements include our first syntheses of multi-layer nanomagnetic systems for thermometry, and ultra-stable magnetic temperature measurements using techniques borrowed from atomic clock research.”

    'Universal law of touch' will enable new advances in virtual reality

    Seismic waves, commonly associated with earthquakes, have been used by scientists to develop a universal scaling law for the sense of touch. A team, led by researchers at the University of Birmingham, used Rayleigh waves to create the first scaling law for touch sensitivity. The results are published in Science Advances.
    The researchers are part of a European consortium (H-Reality) that is already using the theory to develop new virtual reality technologies that incorporate the sense of touch.
    Rayleigh waves are created by impact between objects and are commonly thought to travel only along surfaces. The team discovered that, when it comes to touch, the waves also travel through layers of skin and bone and are picked up by the body’s touch receptor cells.
    Using mathematical modelling of these touch receptors, the researchers showed how the receptors were located at depths that allowed them to respond to Rayleigh waves. The interaction of these receptors with the Rayleigh waves will vary across species, but the ratio of receptor depth to wavelength remains the same, enabling the universal law to be defined.
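    One compact way to state this relationship (our paraphrase, not the paper’s notation) is that receptor depth scales with the Rayleigh wavelength, so their ratio stays roughly constant across species:

    \[
      \frac{d_{\text{receptor}}}{\lambda_R} \approx \text{constant}, \qquad \lambda_R = \frac{c_R}{f},
    \]

    where \(d_{\text{receptor}}\) is the depth of the touch receptor beneath the skin surface, \(c_R\) is the Rayleigh-wave speed in the tissue, and \(f\) is the vibration frequency.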
    The mathematics used by the researchers to develop the law is based on approaches first developed over a hundred years ago to model earthquakes. The law supports predictions made by the Nobel-Prize-winning physicist Georg von Békésy who first suggested the mathematics of earthquakes could be used to explore connections between Rayleigh waves and touch.
    The team also found that the interaction of the waves and receptors remained even when the stiffness of the outermost layer of skin changed. The ability of the receptors to respond to Rayleigh waves remained unchanged despite the many variations in this outer layer caused by age, gender, profession, or even hydration.
    Dr Tom Montenegro-Johnson, of the University of Birmingham’s School of Mathematics, led the research. He explains: “Touch is a primordial sense, as important to our ancient ancestors as it is to modern day mammals, but it’s also one of the most complex and therefore least understood. While we have universal laws to explain sight and hearing, for example, this is the first time that we’ve been able to explain touch in this way.”
    James Andrews, co-author of the study at the University of Birmingham, adds: “The principles we’ve defined enable us to better understand the different experiences of touch among a wide range of species. For example, if you indent the skin of a rhinoceros by 5mm, they would have the same sensation as a human with a similar indentation — it’s just that the forces required to produce the indentation would be different. This makes a lot of sense in evolutionary terms, since it’s connected to relative danger and potential damage.”
    The work was funded by the European Union’s Horizon 2020 research and innovation programme, under collaborative project “H-Reality.” The other institutions involved in the project are Ultraleap Ltd. (UK), Actronika (France), TU Delft (The Netherlands), and CNRS (France).

    Story Source:
    Materials provided by University of Birmingham. Note: Content may be edited for style and length.

    Fundamental constants place a new speed limit on sound

    Sound has a speed limit. Under normal circumstances, its waves can travel no faster than about 36 kilometers per second, physicists propose October 9 in Science Advances.
    Sound zips along at different rates in different materials — moving faster in water than in air, for example. But under conditions found naturally on Earth, no material can host sound waves that outpace this ultimate limit, which is about 100 times the typical speed of sound traveling in air.
    The team’s reasoning rests on well-known equations of physics and mathematical relationships.  “Given the simplicity of the argument, it suggests that [the researchers] are putting their finger on something very deep,” says condensed matter physicist Kamran Behnia of École Supérieure de Physique et de Chimie Industrielles in Paris.
    The equation for the speed limit rests on fundamental constants, special numbers that rule the cosmos. One such number, the speed of light, sets the universe’s ultimate speed limit — nothing can go faster. Another, known as the fine-structure constant, determines the strength with which electrically charged particles push and pull one another. When combined in the right arrangement with another constant — the ratio of the masses of the proton and electron — these numbers yield sound’s speed limit.
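    In the researchers’ formulation, the limit combines these constants as

    \[
      v_u \;=\; \alpha \left(\frac{m_e}{2 m_p}\right)^{1/2} c \;\approx\; 36\ \text{km/s},
    \]

    where \(\alpha\) is the fine-structure constant, \(m_e\) and \(m_p\) are the electron and proton masses, and \(c\) is the speed of light; plugging in \(\alpha \approx 1/137\) and \(m_e/m_p \approx 1/1836\) reproduces the roughly 36 kilometers per second quoted above.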

    Sound waves, which consist of the vibrations of atoms or molecules, travel through a material as one particle jostles another. The wave’s speed depends on various factors, including the types of chemical bonds holding the material together and how massive its atoms are.
    None of the sound speeds previously measured in a variety of liquids and solids surpass the proposed limit, condensed matter physicist Kostya Trachenko and colleagues found. The fastest speed measured, in diamond, was only about half the theoretical maximum.  
    The limit applies only to solids and liquids at pressures typically found on Earth. At pressures millions of times that of Earth’s atmosphere, sound waves move faster and could surpass the limit.
    One material expected to boast a high sound speed exists only at such high pressures: hydrogen squeezed hard enough to turn into a solid metal (SN: 6/28/19). That metal has never been convincingly created, so the researchers calculated the expected speed instead of using a measurement. Above about 6 million times Earth’s atmospheric pressure, the sound speed limit would be broken, the calculations suggest.
    The role of the fundamental constants in sound’s maximum speed results from how the waves move through materials. Sound travels thanks to the electromagnetic interactions of neighboring atoms’ electrons, which is where the fine-structure constant comes into play. And the proton-electron mass ratio is important because, although the electrons are interacting, the nuclei of the atoms move as a result.
    The fine-structure constant and the proton-electron mass ratio are dimensionless constants, meaning there are no units attached to them (so their value does not depend on any particular system of units). Such dimensionless constants fascinate physicists, because the values are crucial to the existence of the universe as we know it (SN: 11/2/16). For example, if the fine-structure constant were significantly altered, stars, planets and life couldn’t have formed. But no one can explain why these all-important numbers have the values they do.
    “When I have sleepless nights, I sometimes think about this,” says Trachenko, of Queen Mary University of London. So he and colleagues are extending this puzzle from the cosmic realm to more commonplace concepts like the speed of sound. Trachenko and coauthor Vadim Veniaminovich Brazhkin of the Institute for High Pressure Physics, in Troitsk, Russia, also reported a minimum possible viscosity for liquids in the April 24 Science Advances.
    That viscosity limit depends on the Planck constant, a number at the heart of quantum mechanics, the math that governs physics on very small scales. If the Planck constant were 100 times larger, Trachenko says, “water would be like honey, and that probably would be the end of life because the processes in cells would not flow as efficiently.”