More stories

  • Reactive Video playback that you control with your body

    Computer scientists have developed an entirely new way of interacting with video content that adapts to, and is controlled by, your body movement.
    Fitness videos and other instructional content that aims to teach viewers new martial arts skills, exercises or yoga positions have been popular since the VHS era of the 1980s and are abundant on Internet platforms like YouTube.
    However, these traditional forms of instructional videos can lead to frustration, and even the potential for physical strain, as novice viewers, or those with limited physical mobility, struggle to keep up and mimic the movements of the expert instructors.
    Now an international team of researchers from Lancaster University, Stanford University and FXPAL has created a solution that dynamically adapts to mirror the position of the viewer’s body and matches the speed of video playback to the viewer’s movements.
    The system, called ‘Reactive Video’, uses a Microsoft Kinect sensor, the latest in skeleton-tracking software, and probabilistic algorithms to identify the position and movement of joints and limbs, such as elbows, knees, arms, hands, hips and legs. By working out the viewer’s movements, it can compare them with the movements of the instructor in the video footage. It then estimates the time the user will take to perform a movement and adjusts the video’s playback position and pace to match the viewer.
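    The published summary does not include the authors’ code, but the core playback logic it describes, comparing the viewer’s tracked joints against the instructor’s pose and advancing the video only as far as the viewer has progressed, can be sketched roughly as below. The function and variable names are illustrative assumptions rather than the Reactive Video implementation.
    ```python
    import numpy as np

    def pose_distance(user_pose, video_pose):
        """Mean Euclidean distance between corresponding joints.

        Both poses are (num_joints, 3) arrays of normalised joint
        coordinates, e.g. from a Kinect skeleton.
        """
        return float(np.mean(np.linalg.norm(user_pose - video_pose, axis=1)))

    def next_playback_frame(user_pose, instructor_poses, current_frame, window=30):
        """Advance playback only as far as the viewer's body has progressed.

        Searches a small window of upcoming frames for the instructor pose
        that best matches the viewer, so the video follows the viewer's own
        pace rather than a fixed frame rate.
        """
        end = min(current_frame + window, len(instructor_poses))
        distances = [pose_distance(user_pose, instructor_poses[f])
                     for f in range(current_frame, end)]
        if not distances:
            return current_frame
        return current_frame + int(np.argmin(distances))

    # Hypothetical playback loop: on every tick, read the tracked skeleton
    # and jump the video to the frame that best mirrors the viewer.
    # current_frame = next_playback_frame(read_skeleton(), instructor_poses, current_frame)
    ```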
    As well as providing a more immersive experience, Reactive Video also helps users to more accurately mimic and learn new movements.

    The researchers tested the system on study participants performing tai chi and radio exercises — a form of callisthenics popular in Japan. The results from the study showed that both systems could adapt to the users’ movements.
    Dr Christopher Clarke, researcher from Lancaster University and co-author on the paper, said: “Since the 1980s, and especially now with the Internet, videos have helped people stay active and have offered a cheaper, more convenient alternative to gym memberships and personal trainers. However, traditional video players do have limitations — they can’t provide feedback, or adapt the pace and intensity of the physical movement to the user.
    “We know performing movements in slow motion is beneficial for learning by providing opportunities to analyse your movements, and developing timing. We also know it can result in less physical strain for inexperienced users.
    “For some people, keeping pace can be tricky — especially when learning something new, and for older people or those with impaired movement. Also, constantly reaching for a remote to pause, rewind and replay, can be frustrating and breaks the immersion.
    “Our system overcomes these issues by having the video automatically adjust itself to play back at the user’s speed, which is less stressful and more beneficial for learning.”
    Don Kimber, co-author of the research, said: “Reactive Video acts and feels like a magic mirror where, as you move, the video mirrors your movement, but with a cleaned-up version of the procedure, or position, performed correctly by an expert for the user to mimic and learn from.”

    An additional benefit of Reactive Video, and something that sets it apart from exercise content developed for game consoles, is that it can be applied to existing footage of appropriate video content, removing the need to create specially produced bespoke content.
    “By using this system we can post-process existing instructional video content and enhance it to dynamically adapt to users providing a fundamental shift in how we can potentially interact with videos,” said Dr Clarke.
    The team believe that with further research this kind of adaptive technology could be developed for sports and activities such as learning dance routines or honing golf swings.
    The Reactive Video system was presented at UIST 2020, a leading academic conference in the field of Human-Computer Interaction.
    It is detailed in the paper ‘Reactive Video: Adaptive Video Playback Based on User Motion for Supporting Physical Activity’.
    The study’s authors are Christopher Clarke, of Lancaster University; Doga Cavdir of Stanford University; and Patrick Chiu, Laurent Denoue and Don Kimber, of FXPAL.

  • Discovery suggests new promise for nonsilicon computer transistors

    For decades, one material has so dominated the production of computer chips and transistors that the tech capital of the world — Silicon Valley — bears its name. But silicon’s reign may not last forever.
    MIT researchers have found that an alloy called InGaAs (indium gallium arsenide) could hold the potential for smaller and more energy efficient transistors. Previously, researchers thought that the performance of InGaAs transistors deteriorated at small scales. But the new study shows this apparent deterioration is not an intrinsic property of the material itself.
    The finding could one day help push computing power and efficiency beyond what’s possible with silicon. “We’re really excited,” said Xiaowei Cai, the study’s lead author. “We hope this result will encourage the community to continue exploring the use of InGaAs as a channel material for transistors.”
    Cai, now with Analog Devices, completed the research as a PhD student in the MIT Microsystems Technology Laboratories and Department of Electrical Engineering and Computer Science (EECS), with Donner Professor Jesús del Alamo. Her co-authors include Jesús Grajal of Polytechnic University of Madrid, as well as MIT’s Alon Vardi and del Alamo. The paper will be presented this month at the virtual IEEE International Electron Devices Meeting.
    Transistors are the building blocks of a computer. Their role as switches, either halting electric current or letting it flow, gives rise to a staggering array of computations — from simulating the global climate to playing cat videos on YouTube. A single laptop could contain billions of transistors. For computing power to improve in the future, as it has for decades, electrical engineers will have to develop smaller, more tightly packed transistors. To date, silicon has been the semiconducting material of choice for transistors. But InGaAs has shown hints of becoming a potential competitor.
    Electrons can zip through InGaAs with ease, even at low voltage. The material is “known to have great [electron] transport properties,” says Cai. InGaAs transistors can process signals quickly, potentially resulting in speedier calculations. Plus, InGaAs transistors can operate at relatively low voltage, meaning they could enhance a computer’s energy efficiency. So InGaAs might seem like a promising material for computer transistors. But there’s a catch.
    InGaAs’ favorable electron transport properties seem to deteriorate at small scales — the scales needed to build faster and denser computer processors. The problem has led some researchers to conclude that nanoscale InGaAs transistors simply aren’t suited for the task. But, says Cai, “we have found that that’s a misconception.”
    The team discovered that InGaAs’ small-scale performance issues are due in part to oxide trapping. This phenomenon causes electrons to get stuck while trying to flow through a transistor. “A transistor is supposed to work as a switch. You want to be able to turn a voltage on and have a lot of current,” says Cai. “But if you have electrons trapped, what happens is you turn a voltage on, but you only have a very limited amount of current in the channel. So the switching capability is a lot lower when you have that oxide trapping.”
    Cai’s team pinpointed oxide trapping as the culprit by studying the transistor’s frequency dependence — the rate at which electric pulses are sent through the transistor. At low frequencies, the performance of nanoscale InGaAs transistors appeared degraded. But at frequencies of 1 gigahertz or greater, they worked just fine — oxide trapping was no longer a hindrance. “When we operate these devices at really high frequency, we noticed that the performance is really good,” she says. “They’re competitive with silicon technology.”
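    A rough way to picture this frequency dependence is to treat the oxide traps as having a characteristic response time: gate pulses much slower than that time give the traps a chance to capture channel electrons, while gigahertz pulses come and go before the traps can react. The toy model below only illustrates that qualitative trend; its time constant and loss figures are invented for illustration and are not taken from the MIT measurements.
    ```python
    import numpy as np

    def trapped_fraction(freq_hz, trap_time_constant=1e-4):
        """Fraction of channel charge captured by slow oxide traps.

        The traps act like a low-pass response: they follow slow gate
        pulses but cannot keep up with fast ones.
        """
        return 1.0 / np.sqrt(1.0 + (2 * np.pi * freq_hz * trap_time_constant) ** 2)

    def relative_drive_current(freq_hz, max_loss=0.6):
        """Drive current as a fraction of the trap-free value (illustrative)."""
        return 1.0 - max_loss * trapped_fraction(freq_hz)

    for f in (1e3, 1e6, 1e9):  # 1 kHz, 1 MHz, 1 GHz
        print(f"{f:.0e} Hz -> {relative_drive_current(f):.2f} of trap-free current")
    ```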
    Cai hopes her team’s discovery will give researchers new reason to pursue InGaAs-based computer transistors. The work shows that “the problem to solve is not really the InGaAs transistor itself. It’s this oxide trapping issue,” she says. “We believe this is a problem that can be solved or engineered out of.” She adds that InGaAs has shown promise in both classical and quantum computing applications.
    “This [research] area remains very, very exciting,” says del Alamo. “We thrive on pushing transistors to the extreme of performance.” One day, that extreme performance could come courtesy of InGaAs.
    This research was supported in part by the Defense Threat Reduction Agency and the National Science Foundation.

    Story Source:
    Materials provided by Massachusetts Institute of Technology. Original written by Daniel Ackerman. Note: Content may be edited for style and length.

  • New study tests machine learning on detection of borrowed words in world languages

    Lexical borrowing, or the direct transfer of words from one language to another, has interested scholars for millennia, as evidenced already in Plato’s Kratylos dialogue, in which Socrates discusses the challenge imposed by borrowed words on etymological studies. In historical linguistics, lexical borrowings help researchers trace the evolution of modern languages and indicate cultural contact between distinct linguistic groups — whether recent or ancient. However, the techniques for identifying borrowed words have resisted formalization, demanding that researchers rely on a variety of proxy information and the comparison of multiple languages.
    “The automated detection of lexical borrowings is still one of the most difficult tasks we face in computational historical linguistics,” says Johann-Mattis List, who led the study.
    In the current study, researchers from PUCP and MPI-SHH employed different machine learning techniques to train language models that mimic the way in which linguists identify borrowings when considering only the evidence provided by a single language: if sounds, or the ways in which sounds combine to form words, are atypical compared with other words in the same language, this often hints at recent borrowings. The models were then applied to a modified version of the World Loanword Database, a catalog of borrowing information for a sample of 40 languages from different language families all over the world, in order to see how accurately the different techniques would classify words within a given language as borrowed or not.
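    The study’s models are more sophisticated, but the monolingual intuition is easy to sketch: train a simple character model on a language’s own vocabulary and flag words whose sound sequences that model finds unlikely. The example below uses a character-bigram model and a made-up word list; it illustrates the idea and is not the authors’ code.
    ```python
    from collections import Counter, defaultdict
    import math

    def train_bigram_model(words):
        """Count character bigrams over a language's (assumed native) vocabulary."""
        counts = defaultdict(Counter)
        for w in words:
            padded = f"^{w}$"            # word boundary markers
            for a, b in zip(padded, padded[1:]):
                counts[a][b] += 1
        return counts

    def avg_log_prob(word, counts, alpha=1.0):
        """Length-normalised, add-alpha smoothed log-probability of a word."""
        padded = f"^{word}$"
        vocab = {c for ctr in counts.values() for c in ctr} | set(padded)
        total = 0.0
        for a, b in zip(padded, padded[1:]):
            ctr = counts.get(a, Counter())
            num = ctr[b] + alpha
            den = sum(ctr.values()) + alpha * len(vocab)
            total += math.log(num / den)
        return total / len(word)

    # Words scoring far below typical native words become borrowing candidates;
    # where to draw that threshold is the hard part the study evaluates.
    native_words = ["haus", "hund", "wasser", "berg", "wald", "stein"]
    model = train_bigram_model(native_words)
    for w in ["feld", "computer"]:
        print(w, round(avg_log_prob(w, model), 2))
    ```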
    In many cases the results were unsatisfying, suggesting that loanword detection is too difficult for the machine learning methods most commonly used. However, in specific situations, such as in lists with a high proportion of loanwords or in languages whose loanwords come primarily from a single donor language, the teams’ lexical language models showed some promise.
    “After these first experiments with monolingual lexical borrowings, we can proceed to stake out other aspects of the problem, moving into multilingual and cross-linguistic approaches,” says John Miller of PUCP, the study’s co-lead author.
    “Our computer-assisted approach, along with the dataset we are releasing, will shed new light on the importance of computer-assisted methods for language comparison and historical linguistics,” adds Tiago Tresoldi, the study’s other co-lead author from MPI-SHH.
    The study joins ongoing efforts to tackle one of the most challenging problems in historical linguistics, showing that loanword detection cannot rely on monolingual information alone. In the future, the authors hope to develop better-integrated approaches that take multilingual information into account.

    Story Source:
    Materials provided by Max Planck Institute for the Science of Human History. Note: Content may be edited for style and length.

  • Engineering discovery challenges heat transfer paradigm that guides electronic and photonic device design

    A research breakthrough from the University of Virginia School of Engineering demonstrates a new mechanism to control temperature and extend the lifetime of electronic and photonic devices such as sensors, smart phones and transistors.
    The discovery, from UVA’s Experiments and Simulations in Thermal Engineering research group, challenges a fundamental assumption about heat transfer in semiconductor design. In devices, electrical contacts form at the junction of a metal and a semiconducting material. Traditionally, materials and device engineers have assumed that electron energy moves across this junction through a process called charge injection, said group leader Patrick Hopkins, professor of mechanical and aerospace engineering with courtesy appointments in materials science and engineering and physics.
    Charge injection posits that with the flow of the electrical charge, electrons physically jump from the metal into the semiconductor, taking their excess heat with them. This changes the electrical composition and properties of the insulating or semiconducting materials. The cooling that goes hand-in-hand with charge injection can significantly degrade device efficiency and performance.
    Hopkins’ group discovered a new heat transfer path that embraces the benefits of cooling associated with charge injection without any of the drawbacks of the electrons physically moving into the semiconductor device. They call this mechanism ballistic thermal injection.
    As described by Hopkins’ advisee John Tomko, a Ph.D. student in materials science and engineering: “The electron gets to the bridge between its metal and the semiconductor, sees another electron across the bridge and interacts with it, transferring its heat but staying on its own side of the bridge. The semiconducting material absorbs a lot of heat, but the number of electrons remains constant.”
    “The ability to cool electrical contacts by keeping charge densities constant offers a new direction in electronic cooling without impacting the electrical and optical performance of the device,” Hopkins said. “The ability to independently optimize optical, electrical and thermal behavior of materials and devices improves device performance and longevity.”
    Tomko’s expertise in laser metrology — measuring energy transfer at the nanoscale — revealed ballistic thermal injection as a new path for device self-cooling. Tomko’s measurement technique, more specifically optical laser spectroscopy, is an entirely new way to measure heat transfer across the metal-semiconductor interface.

    “Previous methods of measurement and observation could not decompose the heat transfer mechanism separately from charge injection,” Tomko said.
    For their experiments, Hopkins’ research team selected cadmium oxide, a transparent electricity-conducting oxide that looks like glass. Cadmium oxide was a pragmatic choice because its unique optical properties are well suited to Tomko’s laser spectroscopy measurement method.
    Cadmium oxide perfectly absorbs mid-infrared photons in the form of plasmons, quasiparticles composed of synchronized electrons that are an incredibly efficient way of coupling light into a material. Tomko used ballistic thermal injection to shift the wavelength at which perfect absorption occurs, essentially tuning the optical properties of cadmium oxide through injected heat.
    “Our observations of tuning enable us to say definitively that heat transfer happens without swapping electrons,” Tomko said.
    Tomko probed the plasmons to extract information on the number of free electrons on each side of the bridge between the metal and the semiconductor. In this way, Tomko captured the measurement of electrons’ placement before and after the metal was heated and cooled.

    The team’s discovery offers promise for infrared sensing technologies as well. Tomko’s observations reveal that the optical tuning lasts as long as the cadmium oxide remains hot, keeping in mind that time is relative — a trillionth rather than a quadrillionth of a second.
    Ballistic thermal injection can control plasmon absorption and therefore the optical response of non-metal materials. Such control enables highly efficient plasmon absorption at mid-infrared wavelengths. One benefit of this development is that night vision devices can be made more responsive to a sudden, intense change in heat that would otherwise leave the device temporarily blind.
    “The realization of this ballistic thermal injection process across metal/cadmium oxide interfaces for ultrafast plasmonic applications opens the door for us to use this process for efficient cooling of other device-relevant material interfaces,” Hopkins said.
    Tomko first-authored a paper documenting these findings. Nature Nanotechnology published the team’s paper, Long-lived Modulation of Plasmonic Absorption by Ballistic Thermal Injection, on November 9; the paper was also promoted in the journal editors’ News and Views. The Nature Nanotechnology paper adds to a long list of publications for Tomko, who has co-authored more than 30 papers and can now claim first-authorship of two Nature Nanotechnology papers as a graduate student.
    The research paper culminates a two-year, collaborative effort funded by a U.S. Army Research Office Multi-University Research Initiative. Jon-Paul Maria, professor of materials science and engineering at Penn State University, is the principal investigator for the MURI grant, which includes the University of Southern California as well as UVA. This MURI team also collaborated with Josh Caldwell, associate professor of mechanical engineering and electrical engineering at Vanderbilt University.
    The team’s breakthrough relied on Penn State’s expertise in making the cadmium oxide samples, Vanderbilt’s expertise in optical modeling, the University of Southern California’s expertise in computational modeling, and UVA’s expertise in energy transport, charge flow and photonic interactions with plasmons at heterogeneous interfaces, including the development of a novel ultrafast pump-probe laser experiment to monitor the ballistic thermal injection process.

  • New tools 'turn on' quantum gases of ultracold molecules

    JILA researchers have developed tools to “turn on” quantum gases of ultracold molecules, gaining control of long-distance molecular interactions for potential applications such as encoding data for quantum computing and simulations.
    The new scheme for nudging a molecular gas down to its lowest energy state, called quantum degeneracy, while suppressing chemical reactions that break up molecules, finally makes it possible to explore exotic quantum states in which all the molecules interact with one another.
    The research is described in the Dec. 10 issue of Nature. JILA is a joint institute of the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder.
    “Molecules are always celebrated for their long-range interactions, which can give rise to exotic quantum physics and novel control in quantum information science,” NIST/JILA Fellow Jun Ye said. “However, until now, nobody had figured out how to turn on these long-range interactions in a bulk gas.”
    “Now, all this has changed. Our work showed for the first time that we can turn on an electric field to manipulate molecular interactions, get them to cool down further, and start to explore collective physics where all molecules are coupled to each other.”
    The new work follows up on Ye’s many previous achievements with ultracold quantum gases. Researchers have long sought to control ultracold molecules in the same way they can control atoms. Molecules offer additional means of control, including polarity — that is, opposing electrical charges — and many different vibrations and rotations.

    The JILA experiments created a dense gas of about 20,000 trapped potassium-rubidium molecules at a temperature of 250 nanokelvin above absolute zero (about minus 273 degrees Celsius or minus 459 degrees Fahrenheit). Crucially, these molecules are polar, with a positive electric charge at the rubidium atom and a negative charge at the potassium atom. The differences between these positive and negative charges, called electric dipole moments, cause the molecules to behave like tiny compass magnets sensitive to certain forces, in this case electric fields.
    When the gas is cooled to near absolute zero, the molecules stop behaving like particles and instead behave like waves that overlap. The molecules stay apart because they are fermions, a class of particles that cannot be in the same quantum state and location at the same time and therefore repel each other. But they can interact at long range through their overlapping waves, electric dipole moments and other features.
    In the past, JILA researchers created quantum gases of molecules by manipulating a gas containing both types of atoms with a magnetic field and lasers. This time the researchers first loaded the mixture of gaseous atoms into a vertical stack of thin, pancake-shaped traps formed from laser light (called an optical lattice), tightly confining the atoms along the vertical direction. Researchers then used magnetic fields and lasers to bond pairs of atoms together into molecules. Leftover atoms were heated and removed by tuning a laser to excite motion unique to each type of atom.
    Then, with the molecular cloud positioned at the center of a new six-electrode assembly formed by two glass plates and four tungsten rods, researchers generated a tunable electric field.
    The electric field set off repulsive interactions among the molecules that stabilized the gas, reducing inelastic (“bad”) collisions in which the molecules undergo a chemical reaction and escape from the trap. This technique boosted rates of elastic (“good”) interactions more than a hundredfold while suppressing chemical reactions.
    This environment allowed efficient evaporative cooling of the gas down to a temperature below the onset of quantum degeneracy. The cooling process removed the hottest molecules from the lattice trap and allowed the remaining molecules to adjust to a lower temperature through the elastic collisions. Slowly turning on a horizontal electric field over hundreds of milliseconds reduced the trap strength in one direction, long enough for hot molecules to escape and the remaining molecules to cool down. At the end of this process, the molecules returned to their most stable state but now in a denser gas.
    The new JILA method can be applied to make ultracold gases out of other types of polar molecules.
    Ultracold molecular gases may have many practical uses, including new methods for quantum computing using polar molecules as quantum bits; simulations and improved understanding of quantum phenomena such as colossal magnetoresistance (for improved data storage and processing) and superconductivity (for perfectly efficient electric power transmission); and new tools for precision measurement such as molecular clocks or molecular systems that enable searches for new theories of physics.
    Funding was provided by NIST, the Defense Advanced Research Projects Agency, the Army Research Office, and the National Science Foundation.

  • In the past 15 years, climate change has transformed the Arctic

    Fifteen years of grading warming’s impact on the Arctic has made one thing abundantly clear: Climate change has drastically altered the Arctic in that short time period.
    Breaking unfortunate records is “like whack-a-mole,” says Jackie Richter-Menge, a climate scientist at the University of Alaska Fairbanks and an editor of the 2020 Arctic Report Card, released December 8 at the virtual meeting of the American Geophysical Union. From sea ice lows to temperature highs, records keep popping up all over the place. For instance, in June, a record-high 38° Celsius (100.4° Fahrenheit) temperature was recorded in the Arctic Circle (SN:6/23/20). And in 2018, winter ice on the Bering Sea shrank to a 5,500 year low (SN:9/3/20).
    “But quite honestly, the biggest headline is the persistence and robustness of the warming,” Richter-Menge says. In 2007, only a year after the first Arctic Report Card, summer sea ice reached a record low, shrinking to an area 1.6 million square kilometers smaller than the previous year. Then, only five years later, the report card noted a new low, 18 percent below 2007. In 2020, sea ice didn’t set a record, but not for lack of trying: it was still the second lowest in the 42-year record.
    “The transformation of the Arctic to a warmer, less frozen and biologically changed region is well under way,” the report concludes. And it’s changing faster than expected when researchers launched the report card in 2006. The annual average air temperature in the Arctic is rising two to three times faster than the rest of the globe, Richter-Menge says. Over the last 20 years, it’s warmed at a rate of 0.77 degrees C per decade, compared with the global average of 0.29 degrees C per decade.

    Improvements in research techniques over the last 15 years have helped researchers more thoroughly observe warming’s impact and how different aspects of Arctic climate change are linked to one another, she says. These improvements include the ability to measure ice mass via gravity measurements taken by the Gravity Recovery and Climate Experiment (GRACE) satellite. Other satellites have provided additional observations from above, while on-the-ground efforts, such as the Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC), have provided up-close sea ice measurements (SN:4/8/20). The report has also begun to include on-the-ground observations from the Arctic’s Indigenous people, who experience these changes directly (SN:12/11/19).
    The changes have revealed few bright spots, but one is the rebound of bowhead whales, which were hunted almost to extinction around the turn of the 20th century. While researchers are careful to note that the whales are still vulnerable, the four populations of the whales (Balaena mysticetus) now range from 218 in the Okhotsk Sea to around 16,800 in the Bering, Chukchi and Beaufort seas. Researchers suggest that the whales’ rebound is due, at least in part, to the warming that has occurred over the last 30 years. Earlier sea ice melting and warmer surface water mean more krill and other food for these baleen feeders.
    In a rare bit of good news from the Arctic, researchers report that bowhead whales are on the rebound. Unfortunately, the same warming that has aided the whales has melted ice sheets and increased hardships for Indigenous hunters. (Nature Picture Library/Alamy Stock Photo)
    But don’t be fooled. The potential good news is overshadowed by the bad news. There’s been “this accumulation of knowledge and insights that we’ve gained over 15 years,” says Mark Serreze, a climate scientist at the National Snow and Ice Data Center in Boulder, Colo., who wasn’t involved in this year’s report. The 2020 research is “an exclamation point on the changes that have been unfolding,” he says. “The bowhead whales are doing OK, but that’s about it.”

  • Hidden symmetry could be key to more robust quantum systems

    Researchers have found a way to protect highly fragile quantum systems from noise, which could aid in the design and development of new quantum devices, such as ultra-powerful quantum computers.
    The researchers, from the University of Cambridge, have shown that microscopic particles can remain intrinsically linked, or entangled, over long distances even if there are random disruptions between them. Using the mathematics of quantum theory, they discovered a simple setup where entangled particles can be prepared and stabilised even in the presence of noise by taking advantage of a previously unknown symmetry in quantum systems.
    Their results, reported in the journal Physical Review Letters, open a new window into the mysterious quantum world that could revolutionise future technology by preserving quantum effects in noisy environments, which is the single biggest hurdle for developing such technology. Harnessing this capability will be at the heart of ultrafast quantum computers.
    Quantum systems are built on the peculiar behaviour of particles at the atomic level and could revolutionise the way that complex calculations are performed. While a normal computer bit is an electrical switch that can be set to either one or zero, a quantum bit, or qubit, can be set to one, zero, or both at the same time. Furthermore, when two qubits are entangled, an operation on one immediately affects the other, no matter how far apart they are. This dual state is what gives a quantum computer its power. A computer built with entangled qubits instead of normal bits could perform calculations well beyond the capacities of even the most powerful supercomputers.
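    A minimal numerical illustration of that dual state and of entanglement is the two-qubit Bell state, which can be written down directly with NumPy. The sketch is purely illustrative and is not tied to any particular hardware.
    ```python
    import numpy as np

    # Single-qubit basis states |0> and |1>.
    zero = np.array([1.0, 0.0])
    one = np.array([0.0, 1.0])

    # A lone qubit can sit in a superposition of both values at once.
    plus = (zero + one) / np.sqrt(2)
    print(np.round(plus ** 2, 2))   # -> [0.5 0.5], equal chance of 0 or 1

    # Bell state (|00> + |11>)/sqrt(2): the two qubits are entangled, so a
    # measurement on one immediately fixes what the other will show.
    bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
    print(np.round(bell ** 2, 2))   # -> [0.5 0. 0. 0.5]: only 00 or 11 ever occurs
    ```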
    “However, qubits are extremely finicky things, and the tiniest bit of noise in their environment can cause their entanglement to break,” said Dr Shovan Dutta from Cambridge’s Cavendish Laboratory, the paper’s first author. “Until we can find a way to make quantum systems more robust, their real-world applications will be limited.”
    Several companies — most notably, IBM and Google — have developed working quantum computers, although so far these have been limited to fewer than 100 qubits. They require near-total isolation from noise, and even then have very short lifetimes of a few microseconds. Both companies have plans to develop 1,000-qubit quantum computers within the next few years, although unless the stability issues are overcome, quantum computers will not reach practical use.
    Now, Dutta and his co-author Professor Nigel Cooper have discovered a robust quantum system where multiple pairs of qubits remain entangled even with a lot of noise.
    They modelled an atomic system in a lattice formation, where atoms strongly interact with each other, hopping from one site of the lattice to another. The authors found that if noise was added in the middle of the lattice, it didn’t affect particles entangled between the left and right sides. This surprising feature results from a special type of symmetry that conserves the number of such entangled pairs.
    “We weren’t expecting this stabilised type of entanglement at all,” said Dutta. “We stumbled upon this hidden symmetry, which is very rare in these noisy systems.”
    They showed this hidden symmetry protects the entangled pairs and allows their number to be controlled from zero to a large maximum value. Similar conclusions can be applied to a broad class of physical systems and can be realised with already existing ingredients in experimental platforms, paving the way to controllable entanglement in a noisy environment.
    “Uncontrolled environmental disturbances are bad for survival of quantum effects like entanglement, but one can learn a lot by deliberately engineering specific types of disturbances and seeing how the particles respond,” said Dutta. “We’ve shown that a simple form of disturbance can actually produce — and preserve — many entangled pairs, which is a great incentive for experimental developments in this field.”
    The researchers are hoping to confirm their theoretical findings with experiments within the next year.

  • Stretchable micro-supercapacitors to self-power wearable devices

    A stretchable system that can harvest energy from human breathing and motion for use in wearable health-monitoring devices may be possible, according to an international team of researchers, led by Huanyu “Larry” Cheng, Dorothy Quiggle Career Development Professor in Penn State’s Department of Engineering Science and Mechanics.
    The research team, with members from Penn State as well as Minjiang University and Nanjing University, both in China, recently published its results in Nano Energy.
    According to Cheng, current versions of batteries and supercapacitors powering wearable and stretchable health-monitoring and diagnostic devices have many shortcomings, including low energy density and limited stretchability.
    “This is something quite different than what we have worked on before, but it is a vital part of the equation,” Cheng said, noting that his research group and collaborators tend to focus on developing the sensors in wearable devices. “While working on gas sensors and other wearable devices, we always need to combine these devices with a battery for powering. Using micro-supercapacitors gives us the ability to self-power the sensor without the need for a battery.”
    An alternative to batteries, micro-supercapacitors are energy storage devices that can complement or replace lithium-ion batteries in wearable devices. Micro-supercapacitors have a small footprint, high power density, and the ability to charge and discharge quickly. However, according to Cheng, when fabricated for wearable devices, conventional micro-supercapacitors have a “sandwich-like” stacked geometry that displays poor flexibility, long ion diffusion distances and a complex integration process when combined with wearable electronics.
    This led Cheng and his team to explore alternative device architectures and integration processes to advance the use of micro-supercapacitors in wearable devices. They found that arranging micro-supercapacitor cells in a serpentine, island-bridge layout allows the configuration to stretch and bend at the bridges, while reducing deformation of the micro-supercapacitors — the islands. When combined, the structure becomes what the researchers refer to as “micro-supercapacitors arrays.”
    “By using an island-bridge design when connecting cells, the micro-supercapacitor arrays displayed increased stretchability and allowed for adjustable voltage outputs,” Cheng said. “This allows the system to be reversibly stretched up to 100%.”
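    The adjustable output follows from elementary circuit rules: islands wired in series add their voltages, while parallel strings add capacitance. The sketch below shows that trade-off; the per-cell values are made up for illustration and are not taken from the paper.
    ```python
    def array_output(cell_voltage, cell_capacitance, n_series, n_parallel):
        """Output voltage and total capacitance of an n_series x n_parallel array.

        Cells in series add voltage but divide capacitance; parallel strings
        then add their capacitances back together.
        """
        voltage = cell_voltage * n_series
        capacitance = (cell_capacitance / n_series) * n_parallel
        return voltage, capacitance

    # Hypothetical cell: 0.8 V and 2 mF per micro-supercapacitor island.
    for n_series, n_parallel in [(1, 4), (2, 2), (4, 1)]:
        v, c = array_output(0.8, 2e-3, n_series, n_parallel)
        print(f"{n_series} series x {n_parallel} parallel -> {v:.1f} V, {c * 1e3:.1f} mF")
    ```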
    By using non-layered, ultrathin zinc-phosphorus nanosheets and 3D laser-induced graphene foam — a highly porous, self-heating nanomaterial — to construct the island-bridge design of the cells, Cheng and his team saw drastic improvements in electric conductivity and the number of absorbed charged ions. This proved that these micro-supercapacitor arrays can charge and discharge efficiently and store the energy needed to power a wearable device.
    The researchers also integrated the system with a triboelectric nanogenerator, an emerging technology that converts mechanical movement to electrical energy. This combination created a self-powered system.
    “When we have this wireless charging module that’s based on the triboelectric nanogenerator, we can harvest energy based on motion, such as bending your elbow or breathing and speaking,” Cheng said. “We are able to use these everyday human motions to charge the micro-supercapacitors.”
    By combining this integrated system with a graphene-based strain sensor, the energy-storing micro-supercapacitor arrays — charged by the triboelectric nanogenerators — are able to power the sensor, Cheng said, showing the potential for this system to power wearable, stretchable devices.

    Story Source:
    Materials provided by Penn State. Original written by Tessa M. Pick. Note: Content may be edited for style and length.