More stories

  • Superconductors are super resilient to magnetic fields

    A researcher at the University of Tsukuba has offered a new explanation for how superconductors exposed to a magnetic field can recover — without loss of energy — to their previous state after the field is removed. This work may lead to a new theory of superconductivity and a more eco-friendly electrical distribution system.
    Superconductors are a class of materials with the amazing property of being able to conduct electricity with zero resistance. In fact, an electrical current can circle around a loop of superconducting wire indefinitely. The catch is that these materials must be kept very cold, and even so, a strong magnetic field can cause a superconductor to revert back to normal.
    It was once assumed that the superconducting-to-normal transition caused by a magnetic field could not be reversed easily, since the energy would be dissipated by the usual process of Joule heating. This mechanism, by which the resistance in normal wires converts electrical energy into heat, is what allows us to use an electric stovetop or space heater.
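    For reference, Joule’s law below gives the power that an ordinary resistive conductor dissipates as heat; in a superconductor the resistance, and hence the dissipation, is zero.

    ```latex
    % Joule heating: power P dissipated by a current I flowing through a resistance R
    % (equivalently, across a voltage drop V). With R = 0, as in a superconductor, P = 0.
    P \;=\; I^{2}R \;=\; VI
    ```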
    “Joule heating is usually considered negatively, because it wastes energy and can even cause overloaded wires to melt,” explains Professor Hiroyasu Koizumi of the Division of Quantum Condensed Matter Physics, the Center for Computational Sciences at the University of Tsukuba. “However, it has been known for a long time from experiments that, if you remove the magnetic field, a current-carrying superconductor can, in fact, be returned to its previous state without loss of energy.”
    Now, Professor Koizumi has proposed a new explanation for this phenomenon. In the superconducting state, electrons pair up and move in sync, but the true cause of this synchronized motion is the presence of a so-called “Berry connection,” characterized by a topological quantum number. This number is an integer, and a current flows whenever it is nonzero. The supercurrent can therefore be switched off abruptly, without Joule heating, by changing this number to zero.
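    As a rough illustration (a standard textbook form, not necessarily the exact quantity defined in Koizumi’s paper), the topological quantum number of a phase variable around a closed superconducting loop is a winding number, which is always an integer:

    ```latex
    % Winding number of an angular (phase) variable \chi, such as the one associated with
    % the Berry connection, around a closed loop C; it can only take integer values.
    n \;=\; \frac{1}{2\pi}\oint_{C} \nabla\chi \cdot d\boldsymbol{\ell} \;\in\; \mathbb{Z}
    ```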
    The founder of modern electromagnetic theory, James Clerk Maxwell, once postulated a similar molecular vortex model that imagined space being filled with currents rotating in tiny circles. Since everything was spinning the same way, it reminded Maxwell of “idle wheels,” the gears used in machines to keep neighboring wheels turning in the same direction.
    “The surprising thing is that a model from the early days of electromagnetism, like Maxwell’s idle wheels, can help us resolve questions arising today,” Professor Koizumi says. “This research may help lead to a future in which energy can be delivered from power plants to homes with perfect efficiency.”

    Story Source:
    Materials provided by University of Tsukuba. Note: Content may be edited for style and length.

  • Positive results for ReWalk ReStore exosuit in stroke rehabilitation

    A team of U.S. researchers published the results of a multi-center, single-arm trial of the ReWalk ReStore™ for gait training in individuals undergoing post-stroke rehabilitation. They found the device safe and reliable during treadmill and overground walking under the supervision of physical therapists. The article, “The ReWalk ReStore soft robotic exosuit: a multisite clinical trial of the safety, reliability, and feasibility of exosuit-augmented post-stroke gait rehabilitation,” was published open access in the Journal of NeuroEngineering and Rehabilitation on June 18, 2020.
    The authors are the principal investigators of each of the five testing sites: Louis N. Awad, PT, DPT, PhD, of Spaulding Rehabilitation Hospital, Boston, MA; Alberto Esquenazi, MD, of MossRehab Stroke and Neurological Disease Center, Elkins Park, PA; Gerard E. Francisco, MD, of TIRR Memorial Hermann, Houston, TX; Karen J. Nolan, PhD, of Kessler Foundation, West Orange, NJ; and lead investigator Arun Jayaraman, PT, PhD, of the Shirley Ryan AbilityLab, Chicago, IL.
    The ReStore™ exosuit (ReWalk Robotics, Ltd) is the first soft robotic exosuit cleared by the FDA for use in stroke survivors with mobility deficits. The device is indicated for individuals with hemiplegia undergoing stroke rehabilitation under the care of licensed physical therapists. Hemiplegia causes weakness of the ankle, limiting the ability to clear the ground during stepping and hindering forward movement. This leads to compensatory walking patterns that increase effort and decrease stability.
    ReStore is designed to augment ankle plantarflexion and dorsiflexion, allowing a more normal gait pattern. Motors mounted on a waist belt transmit power through cables to attachment points on an insole and the patient’s calf. Sensors clipped to the patient’s shoes transmit data to a handheld smartphone controller used by a trained therapist to adjust levels of assistance and monitor and record key metrics of gait training.
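    The article does not describe the device’s control software; purely as an illustration of the sensor-to-motor architecture described above, a simplified assistance loop might look like the sketch below. The gait-phase thresholds, function names, and assistance profile are hypothetical, not ReWalk’s implementation.

    ```python
    # Hypothetical sketch of a sensor-driven assistance loop for a cable-driven ankle exosuit.
    # Thresholds, names, and the assistance profile are illustrative only, not ReWalk's design.
    import random

    ASSIST_LEVEL = 0.5  # set by the therapist via the handheld controller (0..1)

    def read_foot_sensor():
        """Stand-in for the shoe-clipped sensor: returns an estimated gait phase (0..1)."""
        return random.random()

    def cable_command(phase, assist_level):
        """Toy assistance profile: pull the plantarflexion cable late in stance,
        and the dorsiflexion cable during swing, to aid push-off and toe clearance."""
        if 0.4 <= phase < 0.6:        # late stance: assist push-off (plantarflexion)
            return ("plantarflexion", assist_level * (phase - 0.4) / 0.2)
        elif phase >= 0.6:            # swing: assist ground clearance (dorsiflexion)
            return ("dorsiflexion", assist_level * 0.8)
        return ("none", 0.0)          # early stance: no assistance

    for _ in range(5):                # a few iterations of the control loop
        phase = read_foot_sensor()
        cable, tension = cable_command(phase, ASSIST_LEVEL)
        print(f"phase={phase:.2f} -> {cable} tension={tension:.2f}")
    ```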
    The trial enrolled 44 participants with post-stroke hemiparesis who were able to walk unassisted for 5 feet. The protocol consisted of five days of 20-minute sessions of treadmill and overground training under the supervision of licensed physical therapists. To assess the therapeutic potential of ReStore in rehabilitation, the researchers also explored the effects of the device on maximum walking speed, measuring participants’ walking speed in and out of the device using the 10-meter walk test before and after the five training visits. For safety purposes, some participants were allowed to use an ankle-foot orthosis (AFO) or cane during walking sessions.
    The trial determined the safety, reliability, and feasibility of the device in this stroke population. “We found that the ReStore provided targeted assistance for plantarflexion and dorsiflexion of the paretic ankle, improving the gait pattern,” explained Dr. Nolan, senior research scientist in the Center for Mobility and Rehabilitation Engineering Research at Kessler Foundation. “This is an important first step toward expanding options for rehabilitative care for the millions of individuals with mobility impairments caused by ischemic and hemorrhagic stroke.”
    The trial’s exploratory data indicated positive effects of the training on the walking speed of participants during exosuit-assisted walking and unassisted walking (walking without the device). More than one third of participants achieved a significant increase in unassisted walking speed, indicating that further research is warranted.
    Dr. Nolan emphasized that the trial was not designed to measure the device’s efficacy: “Controlled trials are needed to determine the efficacy of ReStore for improving mobility outcomes of stroke rehabilitation.”

    Story Source:
    Materials provided by Kessler Foundation. Note: Content may be edited for style and length.

  • Vibration device makes homes 'smart' by tracking appliances

    To boost efficiency in typical households — where people forget to take wet clothes out of washing machines, retrieve hot food from microwaves and turn off dripping faucets — Cornell University researchers have developed a single device that can track 17 types of appliances using vibrations.
    The device, called VibroSense, uses lasers to capture subtle vibrations in walls, ceilings and floors, as well as a deep learning network that models the vibrometer’s data to create different signatures for each appliance — bringing researchers closer to a more efficient and integrated smart home.
    “Recognizing home activities can help computers better understand human behaviors and needs, with the hope of developing a better human-machine interface,” said Cheng Zhang, assistant professor of information science and senior author of “VibroSense: Recognizing Home Activities by Deep Learning Subtle Vibrations on an Interior Surface of a House from a Single Point Using Laser Doppler Vibrometry.” The paper was published in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies and will be presented at the ACM International Joint Conference on Pervasive and Ubiquitous Computing, which will be held virtually Sept. 12-17.
    “In order to have a smart home at this point, you’d need each device to be smart, which is not realistic; or you’d need to install separate sensors on each device or in each area,” said Zhang, who directs Cornell’s SciFi Lab. “Our system is the first that can monitor devices across different floors, in different rooms, using one single device.”
    In order to detect usage across an entire house, the researchers’ task was twofold: detect tiny vibrations using a laser Doppler vibrometer; and differentiate similar vibrations created by multiple devices by identifying the paths traveled by the vibrations from room to room.
    The deep learning network was trained to distinguish different activities, partly by learning path signatures — the distinctive path vibrations followed through the house — as well as their distinct noises.
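    The paper’s actual network architecture is not detailed here; as a rough sketch of the general approach, a deep model that classifies appliance activity from vibration spectrograms might look like the following (input shapes and the 17-way label set are illustrative, not the authors’ VibroSense model).

    ```python
    # Illustrative appliance-activity classifier over vibration spectrograms (not the
    # authors' VibroSense network). Frequency bins are treated as channels and a 1D CNN
    # runs over time.
    import torch
    import torch.nn as nn

    NUM_ACTIVITIES = 17  # e.g., washing machine, microwave, dripping faucet, ...

    class VibrationClassifier(nn.Module):
        def __init__(self, n_freq_bins=128, n_classes=NUM_ACTIVITIES):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(n_freq_bins, 64, kernel_size=5, padding=2), nn.ReLU(),
                nn.Conv1d(64, 64, kernel_size=5, padding=2), nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),   # pool over the time axis
            )
            self.classifier = nn.Linear(64, n_classes)

        def forward(self, x):              # x: (batch, freq_bins, time_steps)
            h = self.features(x).squeeze(-1)
            return self.classifier(h)      # unnormalized scores for each activity

    model = VibrationClassifier()
    spectrograms = torch.randn(8, 128, 200)   # a batch of 8 synthetic spectrograms
    print(model(spectrograms).shape)          # torch.Size([8, 17])
    ```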
    The device showed nearly 96% accuracy in identifying 17 different activities — including dripping faucets, an exhaust fan, an electric kettle, a refrigerator and a range hood — across five houses over two days, according to the paper. VibroSense could also distinguish five different stages of appliance usage with an average accuracy of more than 97%.
    In single-story houses, the laser was pointed at an interior wall at the center of the home. It was pointed at the ceiling in two-story homes.
    The device is primarily useful in single-family houses, Zhang said, because in buildings it could pick up activities in neighboring apartments, presenting a potential privacy risk.
    “It would definitely require collaboration between researchers, industry practitioners and government to make sure this was used for the right purposes,” Zhang said.
    Among other uses, the system could help homes monitor energy usage and potentially help reduce consumption.
    “Since our system can detect both the occurrence of an indoor event, as well as the time of an event, it could be used to estimate electricity and water-usage rates, and provide energy-saving advice for homeowners,” Zhang said. “It could also prevent water and electrical waste, as well as electrical failures such as short circuits in home appliances.”

    Story Source:
    Materials provided by Cornell University. Original written by Melanie Lefkowitz. Note: Content may be edited for style and length.

  • Designed antiviral proteins inhibit SARS-CoV-2 in the lab

    Computer-designed small proteins have now been shown to protect lab-grown human cells from SARS-CoV-2, the coronavirus that causes COVID-19.
    The findings are reported today, Sept. 9, in Science.
    In the experiments, the lead antiviral candidate, named LCB1, rivaled the best-known SARS-CoV-2 neutralizing antibodies in its protective actions. LCB1 is currently being evaluated in rodents.
    Coronaviruses are studded with so-called Spike proteins. These latch onto human cells to enable the virus to break in and infect them. Drugs that interfere with this entry mechanism could lead to the treatment, or even the prevention, of infection.
    Institute for Protein Design researchers at the University of Washington School of Medicine used computers to originate new proteins that bind tightly to SARS-CoV-2 Spike protein and obstruct it from infecting cells.
    Beginning in January, more than two million candidate Spike-binding proteins were designed on the computer. Over 118,000 were then produced and tested in the lab.

    “Although extensive clinical testing is still needed, we believe the best of these computer-generated antivirals are quite promising,” said lead author Longxing Cao, a postdoctoral scholar at the Institute for Protein Design.
    “They appear to block SARS-CoV-2 infection at least as well as monoclonal antibodies, but are much easier to produce and far more stable, potentially eliminating the need for refrigeration,” he added.
    The researchers created antiviral proteins through two approaches. First, a segment of the ACE2 receptor, which SARS-CoV-2 naturally binds to on the surface of human cells, was incorporated into a series of small protein scaffolds.
    Second, completely synthetic proteins were designed from scratch. The latter method produced the most potent antivirals, including LCB1, which is roughly six times more potent on a per mass basis than the most effective monoclonal antibodies reported thus far.
    Scientists from the University of Washington School of Medicine in Seattle and Washington University School of Medicine in St. Louis collaborated on this work.
    “Our success in designing high-affinity antiviral proteins from scratch is further proof that computational protein design can be used to create promising drug candidates,” said senior author and Howard Hughes Medical Institute Investigator David Baker, professor of biochemistry at the UW School of Medicine and head of the Institute for Protein Design. In 2019, Baker gave a TED talk on how protein design might be used to stop viruses.
    To confirm that the new antiviral proteins attached to the coronavirus Spike protein as intended, the team collected snapshots of the two molecules interacting by using cryo-electron microscopy. These experiments were performed by researchers in the laboratories of David Veesler, assistant professor of biochemistry at the UW School of Medicine, and Michael S. Diamond, the Herbert S. Gasser Professor in the Division of Infectious Diseases at Washington University School of Medicine in St. Louis.
    “The hyperstable minibinders provide promising starting points for new SARS-CoV-2 therapeutics,” the antiviral research team wrote in their study pre-print, “and illustrate the power of computational protein design for rapidly generating potential therapeutic candidates against pandemic threats.”

    Story Source:
    Materials provided by University of Washington Health Sciences/UW Medicine. Original written by Ian Haydon, Institute for Protein Design. Note: Content may be edited for style and length.

  • Seeing objects through clouds and fog

    Like a comic book come to life, researchers at Stanford University have developed a kind of X-ray vision — only without the X-rays. Working with hardware similar to what enables autonomous cars to “see” the world around them, the researchers enhanced their system with a highly efficient algorithm that can reconstruct three-dimensional hidden scenes based on the movement of individual particles of light, or photons. In tests, detailed in a paper published Sept. 9 in Nature Communications, their system successfully reconstructed shapes obscured by 1-inch-thick foam. To the human eye, it’s like seeing through walls.
    “A lot of imaging techniques make images look a little bit better, a little bit less noisy, but this is really something where we make the invisible visible,” said Gordon Wetzstein, assistant professor of electrical engineering at Stanford and senior author of the paper. “This is really pushing the frontier of what may be possible with any kind of sensing system. It’s like superhuman vision.”
    This technique complements other vision systems that can see through barriers on the microscopic scale — for applications in medicine — because it is geared toward large-scale situations, such as navigating self-driving cars through fog or heavy rain, or imaging the surface of Earth and other planets from satellites through hazy atmospheres.
    Supersight from scattered light
    In order to see through environments that scatter light every-which-way, the system pairs a laser with a super-sensitive photon detector that records every bit of laser light that hits it. As the laser scans an obstruction like a wall of foam, an occasional photon will manage to pass through the foam, hit the objects hidden behind it and pass back through the foam to reach the detector. The algorithm-supported software then uses those few photons — and information about where and when they hit the detector — to reconstruct the hidden objects in 3D.
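    The authors’ reconstruction algorithm models diffuse scattering and is described in the paper; as a greatly simplified illustration of the underlying idea, time-stamped photon detections can be backprojected into a voxel grid, with each photon “voting” for locations consistent with its round-trip travel time. The grid sizes and detections below are synthetic.

    ```python
    # Simplified time-of-flight backprojection: each photon detected at scan position
    # (sx, sy) with arrival time t votes for the voxel whose round-trip distance matches
    # c * t. This toy version ignores the diffuse-scattering model used in the real system.
    import numpy as np

    C = 3e8                                    # speed of light, m/s
    VOXEL_SIZE = 0.01                          # 1 cm voxels
    grid = np.zeros((32, 32, 32))              # reconstruction volume (vote counts)

    # synthetic measurements: (scan_x, scan_y, photon_arrival_time_in_seconds)
    detections = [(10, 12, 2 * 0.15 / C), (11, 12, 2 * 0.16 / C)]

    for sx, sy, t in detections:
        depth = C * t / 2.0                    # one-way distance implied by round-trip time
        k = int(round(depth / VOXEL_SIZE))     # nearest depth layer
        if 0 <= k < grid.shape[2]:
            grid[sx, sy, k] += 1.0             # vote for the voxel at (sx, sy, depth)

    # the brightest voxel approximates where a hidden reflector sits
    print(np.unravel_index(np.argmax(grid), grid.shape))
    ```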
    This is not the first system able to reveal hidden objects through scattering environments, but it circumvents limitations associated with other techniques. For example, some require knowledge of how far away the object of interest is. It is also common for these systems to use only information from ballistic photons — photons that travel to and from the hidden object through the scattering medium without actually scattering along the way.

    “We were interested in being able to image through scattering media without these assumptions and to collect all the photons that have been scattered to reconstruct the image,” said David Lindell, a graduate student in electrical engineering and lead author of the paper. “This makes our system especially useful for large-scale applications, where there would be very few ballistic photons.”
    In order to make their algorithm amenable to the complexities of scattering, the researchers had to closely co-design their hardware and software, although the hardware components they used are only slightly more advanced than what is currently found in autonomous cars. Depending on the brightness of the hidden objects, scanning in their tests took anywhere from one minute to one hour, but the algorithm reconstructed the obscured scene in real time and could be run on a laptop.
    “You couldn’t see through the foam with your own eyes, and even just looking at the photon measurements from the detector, you really don’t see anything,” said Lindell. “But, with just a handful of photons, the reconstruction algorithm can expose these objects — and you can see not only what they look like, but where they are in 3D space.”
    Space and fog
    Someday, a descendant of this system could be sent through space to other planets and moons to help see through icy clouds to deeper layers and surfaces. In the nearer term, the researchers would like to experiment with different scattering environments to simulate other circumstances where this technology could be useful.
    “We’re excited to push this further with other types of scattering geometries,” said Lindell. “So, not just objects hidden behind a thick slab of material but objects that are embedded in densely scattering material, which would be like seeing an object that’s surrounded by fog.”
    Lindell and Wetzstein are also enthusiastic about how this work represents a deeply interdisciplinary intersection of science and engineering.
    “These sensing systems are devices with lasers, detectors and advanced algorithms, which puts them in an interdisciplinary research area between hardware and physics and applied math,” said Wetzstein. “All of those are critical, core fields in this work and that’s what’s the most exciting for me.”

    Story Source:
    Materials provided by Stanford University. Original written by Taylor Kubota. Note: Content may be edited for style and length.

  • As collegiate esports become more professional, women are being left out

    A new study from North Carolina State University reports that the rapidly growing field of collegiate esports is effectively becoming a two-tiered system, with club-level programs that are often supportive of gender diversity being clearly distinct from well-funded varsity programs that are dominated by men.
    “Five years ago, we thought collegiate esports might be an opportunity to create a welcoming, diverse competitive arena, which was a big deal given how male-dominated the professional esports scene was,” says Nick Taylor, co-author of the study and an associate professor of communication at NC State. “Rapid growth of collegiate esports over the past five years has led to it becoming more professional, with many universities having paid esports positions, recruiting players, and so on. We wanted to see how that professionalization has affected collegiate esports and what that means for gender diversity. The findings did not give us reason to be optimistic.”
    For this qualitative study, the researchers conducted in-depth interviews with 21 collegiate esports leaders from the U.S. and Canada. Eight of the study participants were involved in varsity-level esports, such as coaches or administrators, while the remaining 13 participants were presidents of collegiate esports clubs. Six of the participants identified as women; 15 identified as men.
    “Essentially, we found that women are effectively pushed out of esports at many colleges when they start investing financial resources in esports programs,” says Bryce Stout, co-author of the study and a Ph.D. student at NC State. “We thought collegiate esports might help to address the disenfranchisement of women in esports and in gaming more generally; instead, it seems to simply be an extension of that disenfranchisement.”
    “Higher education has been spending increasing amounts of time, money and effort on professionalizing esports programs,” Taylor says. “With some key exceptions, these institutions are clearly not putting as much effort into encouraging diversity in these programs. That effectively cuts out women and minorities.
    “Some leaders stress that they will welcome any player onto their team, as long as the player has a certain skill level,” Taylor says. “But this ignores the systemic problems that effectively drive most women out of gaming — such as harassment. There needs to be a focus on cultivating skill and developing players, rather than focusing exclusively on recruitment.”

    Story Source:
    Materials provided by North Carolina State University. Note: Content may be edited for style and length.

  • Transistor-integrated cooling for a more powerful chip

    Managing the heat generated in electronics is a huge problem, especially with the constant push to shrink devices and pack as many transistors as possible onto the same chip; the challenge is how to remove such high heat fluxes efficiently. Usually, the electronics, designed by electrical engineers, and the cooling systems, designed by mechanical engineers, are developed independently and separately. But now EPFL researchers have revolutionized the process by combining these two design steps into one: they have developed an integrated microfluidic cooling technology, conceived together with the electronics, that can efficiently manage the large heat fluxes generated by transistors. Their research, which has been published in Nature, will lead to even more compact electronic devices and enable the integration of power converters, with several high-voltage devices, into a single chip.
    The best of both worlds
    In this ERC-funded project, Professor Elison Matioli, his doctoral student Remco Van Erp, and their team from the School of Engineering’s Power and Wide-band-gap Electronics Research Laboratory (POWERLAB) set out to change the mentality behind designing electronic devices: conceiving the electronics and the cooling together, right from the beginning, with the aim of extracting heat very close to the regions of the device that heat up the most. “We wanted to combine skills in electrical and mechanical engineering in order to create a new kind of device,” says Van Erp.
    The team was looking to solve the issue of how to cool electronic devices, and especially transistors. “Managing the heat produced by these devices is one of the biggest challenges in electronics going forward,” says Elison Matioli. “It’s becoming increasingly important to minimize the environmental impact, so we need innovative cooling technologies that can efficiently process the large amounts of heat produced in a sustainable and cost-effective way.”
    Microfluidic channels and hot spots
    Their technology is based on integrating microfluidic channels inside the semiconductor chip, together with the electronics, so a cooling liquid flows inside an electronic chip. “We placed microfluidic channels very close to the transistor’s hot spots, with a straightforward and integrated fabrication process, so that we could extract the heat in exactly the right place and prevent it from spreading throughout the device,” says Matioli. The cooling liquid they used was deionized water, which doesn’t conduct electricity. “We chose this liquid for our experiments, but we’re already testing other, more effective liquids so that we can extract even more heat out of the transistor,” says Van Erp.
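    As a back-of-the-envelope reference (not a figure from the paper), the heat a liquid coolant carries away grows with its flow rate and its temperature rise between inlet and outlet; water’s high specific heat, roughly 4.2 kJ per kilogram per kelvin, is what makes even tiny on-chip flows effective.

    ```latex
    % Heat removed per unit time by a coolant with mass flow rate \dot{m}, specific heat
    % capacity c_p, and temperature rise \Delta T between channel inlet and outlet.
    Q \;=\; \dot{m}\, c_{p}\, \Delta T
    ```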
    Reducing energy consumption
    “This cooling technology will enable us to make electronic devices even more compact and could considerably reduce energy consumption around the world,” says Matioli. “We’ve eliminated the need for large external heat sinks and shown that it’s possible to create ultra-compact power converters in a single chip. This will prove useful as society becomes increasingly reliant on electronics.” The researchers are now looking at how to manage heat in other devices, such as lasers and communications systems.

    Story Source:
    Materials provided by Ecole Polytechnique Fédérale de Lausanne. Original written by Valérie Geneux. Note: Content may be edited for style and length.

  • AI used to show how hydrogen becomes a metal inside giant planets

    Dense metallic hydrogen — a phase of hydrogen which behaves like an electrical conductor — makes up the interior of giant planets, but it is difficult to study and poorly understood. By combining artificial intelligence and quantum mechanics, researchers have found how hydrogen becomes a metal under the extreme pressure conditions of these planets.
    The researchers, from the University of Cambridge, IBM Research and EPFL, used machine learning to mimic the interactions between hydrogen atoms in order to overcome the size and timescale limitations of even the most powerful supercomputers. They found that instead of happening as a sudden, or first-order, transition, the hydrogen changes in a smooth and gradual way. The results are reported in the journal Nature.
    Hydrogen, consisting of one proton and one electron, is both the simplest and the most abundant element in the Universe. It is the dominant component of the interior of the giant planets in our solar system — Jupiter, Saturn, Uranus, and Neptune — as well as exoplanets orbiting other stars.
    At the surfaces of giant planets, hydrogen remains a molecular gas. Deeper in their interiors, however, the pressure exceeds millions of standard atmospheres. Under this extreme compression, hydrogen undergoes a phase transition: the covalent bonds inside hydrogen molecules break, and the gas becomes a metal that conducts electricity.
    “The existence of metallic hydrogen was theorised a century ago, but what we haven’t known is how this process occurs, due to the difficulties in recreating the extreme pressure conditions of the interior of a giant planet in a laboratory setting, and the enormous complexities of predicting the behaviour of large hydrogen systems,” said lead author Dr Bingqing Cheng from Cambridge’s Cavendish Laboratory.
    Experimentalists have attempted to investigate dense hydrogen using a diamond anvil cell, in which two diamonds apply high pressure to a confined sample. Although diamond is the hardest substance on Earth, the device will fail under extreme pressure and high temperatures, especially when in contact with hydrogen, contrary to the claim that a diamond is forever. This makes the experiments both difficult and expensive.
    Theoretical studies are also challenging: although the motion of hydrogen atoms can be solved using equations based on quantum mechanics, the computational power needed to calculate the behaviour of systems with more than a few thousand atoms for longer than a few nanoseconds exceeds the capability of the world’s largest and fastest supercomputers.
    It is commonly assumed that the transition of dense hydrogen is first-order, which is accompanied by abrupt changes in all physical properties. A common example of a first-order phase transition is boiling liquid water: once the liquid becomes a vapour, its appearance and behaviour completely change despite the fact that the temperature and the pressure remain the same.
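    In thermodynamic terms (a general textbook statement, not a result of this study), the distinction is whether the first derivatives of the free energy jump at the transition:

    ```latex
    % First-order transition: the first derivatives of the Gibbs free energy G, i.e. the
    % volume V and entropy S, change discontinuously, giving a latent heat L.
    % Continuous transition: V and S vary smoothly and there is no latent heat.
    V = \left(\frac{\partial G}{\partial p}\right)_{T}, \qquad
    S = -\left(\frac{\partial G}{\partial T}\right)_{p}, \qquad
    L = T\,\Delta S
    ```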
    In the current theoretical study, Cheng and her colleagues used machine learning to mimic the interactions between hydrogen atoms, in order to overcome limitations of direct quantum mechanical calculations.
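    The exact machine-learning model is not described in this summary; as a toy illustration of the general idea of a machine-learned potential, a kernel regression can be fitted to reference energies as a function of a structural descriptor and then queried far more cheaply than a quantum calculation. The data below are synthetic, not hydrogen results.

    ```python
    # Toy machine-learned potential: fit "reference" energies versus a single structural
    # descriptor, then use the cheap surrogate for new configurations. Synthetic data only;
    # not the model used in the actual study.
    import numpy as np
    from sklearn.kernel_ridge import KernelRidge

    rng = np.random.default_rng(0)
    descriptors = rng.uniform(0.7, 2.0, size=(200, 1))   # e.g., nearest-neighbour distance
    # pretend "quantum" energies: a Morse-like curve plus a little noise
    energies = (1 - np.exp(-2.0 * (descriptors[:, 0] - 1.0))) ** 2 \
               + rng.normal(0.0, 0.01, 200)

    surrogate = KernelRidge(kernel="rbf", alpha=1e-3, gamma=10.0)
    surrogate.fit(descriptors, energies)                  # "training" on reference data

    test_points = np.array([[0.9], [1.0], [1.5]])
    print(surrogate.predict(test_points))                 # fast surrogate predictions
    ```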
    “We reached a surprising conclusion and found evidence for a continuous molecular to atomic transition in the dense hydrogen fluid, instead of a first-order one,” said Cheng, who is also a Junior Research Fellow at Trinity College.
    The transition is smooth because the associated ‘critical point’ is hidden. Critical points are ubiquitous in all phase transitions between fluids: all substances that can exist in two phases have critical points. A system with an exposed critical point, such as the one for vapour and liquid water, has clearly distinct phases. However, the dense hydrogen fluid, with the hidden critical point, can transform gradually and continuously between the molecular and the atomic phases. Furthermore, this hidden critical point also induces other unusual phenomena, including density and heat capacity maxima.
    The finding about the continuous transition provides a new way of interpreting the contradictory body of experiments on dense hydrogen. It also implies a smooth transition between insulating and metallic layers in giant gas planets. The study would not have been possible without combining machine learning, quantum mechanics, and statistical mechanics. Without any doubt, this approach will uncover more physical insights about hydrogen systems in the future. As the next step, the researchers aim to answer the many open questions concerning the solid phase diagram of dense hydrogen.