More stories

  • Cosmic rays may soon stymie quantum computing

    The practicality of quantum computing hangs on the integrity of the quantum bit, or qubit.
    Qubits, the logic elements of quantum computers, are coherent two-level systems that represent quantum information. Each qubit has the strange ability to be in a quantum superposition, carrying aspects of both states simultaneously, enabling a quantum version of parallel computation. Quantum computers, if they can be scaled to accommodate many qubits on one processor, could be dizzyingly faster, and able to handle far more complex problems, than today’s conventional computers.
    But that all depends on a qubit’s integrity, or how long it can operate before its superposition and the quantum information are lost — a process called decoherence, which ultimately limits the computer run-time. Superconducting qubits — a leading qubit modality today — have achieved exponential improvement in this key metric, from less than one nanosecond in 1999 to around 200 microseconds today for the best-performing devices.
    But researchers at MIT, MIT Lincoln Laboratory, and Pacific Northwest National Laboratory (PNNL) have found that a qubit’s performance will soon hit a wall. In a paper published in Nature, the team reports that the low-level, otherwise harmless background radiation that is emitted by trace elements in concrete walls and incoming cosmic rays are enough to cause decoherence in qubits. They found that this effect, if left unmitigated, will limit the performance of qubits to just a few milliseconds.
    Given the rate at which scientists have been improving qubits, they may hit this radiation-induced wall in just a few years. To overcome this barrier, scientists will have to find ways to shield qubits — and any practical quantum computers — from low-level radiation, perhaps by building the computers underground or designing qubits that are tolerant to radiation’s effects.
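    As a rough, back-of-the-envelope illustration (not a calculation from the paper), the figures quoted above imply a timescale of only a few years: coherence times improved from about 1 nanosecond in 1999 to about 200 microseconds roughly 21 years later, while the radiation-imposed ceiling sits at a few milliseconds.
```python
import math

# Illustrative extrapolation only; the growth rate is inferred from the
# two coherence-time figures quoted in the article, not from the paper.
t1_1999 = 1e-9       # ~1 ns coherence time in 1999
t1_today = 200e-6    # ~200 microseconds for today's best devices
ceiling = 4e-3       # ~4 ms limit attributed to environmental radiation
years_elapsed = 21   # 1999 to roughly the time of the study

rate = (t1_today / t1_1999) ** (1 / years_elapsed)          # ~1.8x per year
years_to_ceiling = math.log(ceiling / t1_today) / math.log(rate)

print(f"historical improvement: ~{rate:.2f}x per year")
print(f"years until the ~4 ms ceiling at that pace: ~{years_to_ceiling:.0f}")
# roughly 5 years, consistent with the article's "just a few years"
```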
    “These decoherence mechanisms are like an onion, and we’ve been peeling back the layers for the past 20 years, but there’s another layer that, left unabated, is going to limit us in a couple years, which is environmental radiation,” says William Oliver, associate professor of electrical engineering and computer science and Lincoln Laboratory Fellow at MIT. “This is an exciting result, because it motivates us to think of other ways to design qubits to get around this problem.”
    The paper’s lead author is Antti Vepsäläinen, a postdoc in MIT’s Research Laboratory of Electronics.

    “It is fascinating how sensitive superconducting qubits are to the weak radiation. Understanding these effects in our devices can also be helpful in other applications such as superconducting sensors used in astronomy,” Vepsäläinen says.
    Co-authors at MIT include Amir Karamlou, Akshunna Dogra, Francisca Vasconcelos, Simon Gustavsson, and physics professor Joseph Formaggio, along with David Kim, Alexander Melville, Bethany Niedzielski, and Jonilyn Yoder at Lincoln Laboratory, and John Orrell, Ben Loer, and Brent VanDevender of PNNL.
    A cosmic effect
    Superconducting qubits are electrical circuits made from superconducting materials. They comprise multitudes of paired electrons, known as Cooper pairs, that flow through the circuit without resistance and work together to maintain the qubit’s tenuous superposition state. If the circuit is heated or otherwise disrupted, electron pairs can split up into “quasiparticles,” causing decoherence in the qubit that limits its operation.
    There are many sources of decoherence that could destabilize a qubit, such as fluctuating magnetic and electric fields, thermal energy, and even interference between qubits.

    Scientists have long suspected that very low levels of radiation may have a similar destabilizing effect in qubits.
    “In the last five years, the quality of superconducting qubits has become much better, and now we’re within a factor of 10 of where the effects of radiation are going to matter,” adds Kim, a technical staff member at MIT Lincoln Laboratory.
    So Oliver and Formaggio teamed up to see how they might nail down the effect of low-level environmental radiation on qubits. As a neutrino physicist, Formaggio has expertise in designing experiments that shield against the smallest sources of radiation, to be able to see neutrinos and other hard-to-detect particles.
    “Calibration is key”
    The team, working with collaborators at Lincoln Laboratory and PNNL, first had to design an experiment to calibrate the impact of known levels of radiation on superconducting qubit performance. To do this, they needed a known radioactive source — one which became less radioactive slowly enough to assess the impact at essentially constant radiation levels, yet quickly enough to assess a range of radiation levels within a few weeks, down to the level of background radiation.
    The group chose to irradiate a foil of high purity copper. When exposed to a high flux of neutrons, copper produces copious amounts of copper-64, an unstable isotope with exactly the desired properties.
    “Copper just absorbs neutrons like a sponge,” says Formaggio, who worked with operators at MIT’s Nuclear Reactor Laboratory to irradiate two small disks of copper for several minutes. They then placed one of the disks next to the superconducting qubits in a dilution refrigerator in Oliver’s lab on campus. At temperatures about 200 times colder than outer space, they measured the impact of the copper’s radioactivity on qubits’ coherence while the radioactivity decreased — down toward environmental background levels.
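    Copper-64’s half-life is about 12.7 hours (a standard nuclear-data value, not stated in the article), which is what makes it the desired kind of source: a quick sketch of its exponential decay shows the activity sweeping from intense down toward background over a few weeks.
```python
# Illustrative sketch of copper-64 decay; the ~12.7-hour half-life is a
# standard nuclear-data value, and the initial activity is arbitrary.
HALF_LIFE_HOURS = 12.7

def remaining_fraction(hours: float) -> float:
    """Fraction of the initial Cu-64 activity left after `hours`."""
    return 0.5 ** (hours / HALF_LIFE_HOURS)

for day in (1, 7, 14, 21):
    print(f"day {day:2d}: {remaining_fraction(day * 24):.1e} of initial activity")
# After about three weeks the activity has dropped by roughly twelve
# orders of magnitude, i.e. down toward environmental background levels.
```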
    The radioactivity of the second disk was measured at room temperature as a gauge for the levels hitting the qubit. Through these measurements and related simulations, the team understood the relation between radiation levels and qubit performance, one that could be used to infer the effect of naturally occurring environmental radiation. Based on these measurements, the qubit coherence time would be limited to about 4 milliseconds.
    “Not game over”
    The team then removed the radioactive source and proceeded to demonstrate that shielding the qubits from the environmental radiation improves the coherence time. To do this, the researchers built a 2-ton wall of lead bricks that could be raised and lowered on a scissor lift, to either shield or expose the refrigerator to surrounding radiation.
    “We built a little castle around this fridge,” Oliver says.
    Every 10 minutes, and over several weeks, students in Oliver’s lab alternated pushing a button to either lift or lower the wall, as a detector measured the qubits’ integrity, or “relaxation rate,” a measure of how the environmental radiation impacts the qubit, with and without the shield. By comparing the two results, they effectively extracted the impact attributed to environmental radiation, confirming the 4 millisecond prediction and demonstrating that shielding improved qubit performance.
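    In rate terms, that comparison is a subtraction: the relaxation rate measured with the shield raised is subtracted from the rate with the shield lowered, and the difference is attributed to external radiation. The sketch below uses made-up rates chosen only to reproduce the roughly 4 millisecond figure; the paper reports the actual measured values.
```python
# Hypothetical relaxation rates (1/T1, in 1/s) with the lead shield
# lowered (qubits exposed) and raised (qubits shielded). Illustrative only.
gamma_exposed = 5120.0
gamma_shielded = 4870.0

# The excess rate is attributed to environmental radiation.
gamma_radiation = gamma_exposed - gamma_shielded
t1_radiation_limit = 1.0 / gamma_radiation

print(f"radiation-induced relaxation rate: {gamma_radiation:.0f} /s")
print(f"implied radiation-only coherence limit: {t1_radiation_limit * 1e3:.1f} ms")
# prints 4.0 ms, matching the prediction described above
```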
    “Cosmic ray radiation is hard to get rid of,” Formaggio says. “It’s very penetrating, and goes right through everything like a jet stream. If you go underground, that gets less and less. It’s probably not necessary to build quantum computers deep underground, like neutrino experiments, but maybe deep basement facilities could probably get qubits operating at improved levels.”
    Going underground isn’t the only option, and Oliver has ideas for how to design quantum computing devices that still work in the face of background radiation.
    “If we want to build an industry, we’d likely prefer to mitigate the effects of radiation above ground,” Oliver says. “We can think about designing qubits in a way that makes them ‘rad-hard,’ and less sensitive to quasiparticles, or design traps for quasiparticles so that even if they’re constantly being generated by radiation, they can flow away from the qubit. So it’s definitely not game-over, it’s just the next layer of the onion we need to address.”
    This research was funded, in part, by the U.S. Department of Energy Office of Nuclear Physics, the U.S. Army Research Office, the U.S. Department of Defense, and the U.S. National Science Foundation.

  • Microscopic robots 'walk' thanks to laser tech

    A Cornell University-led collaboration has created the first microscopic robots that incorporate semiconductor components, allowing them to be controlled — and made to walk — with standard electronic signals.
    These robots, roughly the size of a paramecium, provide a template for building even more complex versions that utilize silicon-based intelligence, can be mass produced, and may someday travel through human tissue and blood.
    The collaboration is led by Itai Cohen, professor of physics; Paul McEuen, the John A. Newman Professor of Physical Science; and their former postdoctoral researcher Marc Miskin, who is now an assistant professor at the University of Pennsylvania.
    The walking robots are the latest iteration, and in many ways an evolution, of Cohen and McEuen’s previous nanoscale creations, from microscopic sensors to graphene-based origami machines.
    The new robots are about 5 microns thick (a micron is one-millionth of a meter), 40 microns wide and range from 40 to 70 microns in length. Each bot consists of a simple circuit made from silicon photovoltaics — which essentially functions as the torso and brain — and four electrochemical actuators that function as legs.
    The researchers control the robots by flashing laser pulses at different photovoltaics, each of which charges up a separate set of legs. By toggling the laser back and forth between the front and back photovoltaics, the robot walks.
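    Conceptually, the gait control is just an alternating illumination schedule. The sketch below is a hypothetical driver loop for illustration only; the hardware function and dwell time are invented stand-ins, not the study's actual control software.
```python
import time

def aim_laser(target: str) -> None:
    """Hypothetical stand-in for steering the laser spot onto one of the
    robot's photovoltaics ('front' or 'back')."""
    print(f"illuminating {target} photovoltaic")

def walk(steps: int, dwell_s: float = 0.1) -> None:
    """Toggle illumination between the front and back photovoltaics; each
    one charges its own pair of electrochemical actuator legs, so the
    alternation produces a walking gait."""
    for _ in range(steps):
        aim_laser("front")
        time.sleep(dwell_s)   # front legs actuate
        aim_laser("back")
        time.sleep(dwell_s)   # back legs actuate

walk(steps=3)
```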
    The robots are certainly high-tech, but they operate with low voltage (200 millivolts) and low power (10 nanowatts), and remain strong and robust for their size. Because they are made with standard lithographic processes, they can be fabricated in parallel: About 1 million bots fit on a 4-inch silicon wafer.
    The researchers are exploring ways to soup up the robots with more complicated electronics and onboard computation — improvements that could one day result in swarms of microscopic robots crawling through and restructuring materials, or suturing blood vessels, or being dispatched en masse to probe large swaths of the human brain.
    “Controlling a tiny robot is maybe as close as you can come to shrinking yourself down. I think machines like these are going to take us into all kinds of amazing worlds that are too small to see,” said Miskin, the study’s lead author.
    “This research breakthrough provides exciting scientific opportunity for investigating new questions relevant to the physics of active matter and may ultimately lead to futuristic robotic materials,” said Sam Stanton, program manager for the Army Research Office, an element of the Combat Capabilities Development Command’s Army Research Laboratory, which supported the research.

    Story Source:
    Materials provided by Cornell University. Original written by David Nutt. Note: Content may be edited for style and length.

  • Natural radiation can interfere with quantum computers

    A multidisciplinary research team has shown that radiation from natural sources in the environment can limit the performance of superconducting quantum bits, known as qubits. The discovery, reported today in the journal Nature, has implications for the construction and operation of quantum computers, an advanced form of computing that has attracted billions of dollars in public and private investment globally.
    The collaboration between teams at the U.S. Department of Energy’s Pacific Northwest National Laboratory (PNNL) and the Massachusetts Institute of Technology (MIT) helps explain a mysterious source of interference limiting qubit performance.
    “Our study is the first to show clearly that low-level ionizing radiation in the environment degrades the performance of superconducting qubits,” said John Orrell, a PNNL research physicist, senior author of the study and expert in low-level radiation measurement. “These findings suggest that radiation shielding will be necessary to attain long-sought performance in quantum computers of this design.”
    Natural radiation wreaks havoc with computers
    Computer engineers have known for at least a decade that natural radiation emanating from materials like concrete and pulsing through our atmosphere in the form of cosmic rays can cause digital computers to malfunction. But digital computers aren’t nearly as sensitive as a quantum computer.
    “We found that practical quantum computing with these devices will not be possible unless we address the radiation issue,” said PNNL physicist Brent VanDevender, a co-investigator on the study.

    The researchers teamed up to solve a puzzle that has been vexing efforts to keep superconducting quantum computers working for long enough to make them reliable and practical. A working quantum computer would be thousands of times faster than even the fastest supercomputer operating today. And it would be able to tackle computing challenges that today’s digital computers are ill-equipped to take on. But the immediate challenge is to have the qubits maintain their state, a feat called “coherence,” said Orrell. This desirable quantum state is what gives quantum computers their power.
    MIT physicist Will Oliver was working with superconducting qubits and became perplexed at a source of interference that helped push the qubits out of their prepared state, leading to “decoherence,” and making them non-functional. After ruling out a number of different possibilities, he considered the idea that natural radiation from sources like metals found in the soil and cosmic radiation from space might be pushing the qubits into decoherence.
    A chance conversation between Oliver, VanDevender, and his long-time collaborator, MIT physicist Joe Formaggio, led to the current project.
    It’s only natural
    To test the idea, the research team measured the performance of prototype superconducting qubits in two different experiments:
    They exposed the qubits to elevated radiation from copper metal activated in a reactor.
    They built a shield around the qubits that lowered the amount of natural radiation in their environment.
    The pair of experiments clearly demonstrated the inverse relationship between radiation levels and length of time qubits remain in a coherent state.
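    One generic way to quantify such a relationship (a sketch with synthetic numbers, not the study's data or analysis) is to fit the relaxation rate, the inverse of the coherence time, as a linear function of radiation level; the intercept captures everything else, and the slope captures the radiation contribution.
```python
import numpy as np

# Synthetic data for illustration: radiation level (arbitrary units)
# versus measured coherence time T1 in seconds. Not the study's values.
radiation = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
t1_seconds = np.array([3.8e-3, 3.0e-3, 2.1e-3, 1.3e-3, 0.75e-3])

# If independent channels add as rates: 1/T1 = gamma_other + k * radiation
slope, intercept = np.polyfit(radiation, 1.0 / t1_seconds, 1)

print(f"radiation-independent rate: ~{intercept:.0f} /s")
print(f"extra rate per unit radiation: ~{slope:.0f} /s")
# Once the radiation term dominates, T1 falls roughly as 1 / (radiation level).
```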

    “The radiation breaks apart matched pairs of electrons that typically carry electric current without resistance in a superconductor,” said VanDevender. “The resistance of those unpaired electrons destroys the delicately prepared state of a qubit.”
    The findings have immediate implications for qubit design and construction, the researchers concluded. For example, the materials used to construct quantum computers should exclude material that emits radiation, the researchers said. In addition, it may be necessary to shield experimental quantum computers from radiation in the atmosphere.
    At PNNL, interest has turned to whether the Shallow Underground Laboratory, which reduces surface radiation exposure by 99%, could serve future quantum computer development. Indeed, a recent study by a European research team corroborates the improvement in qubit coherence when experiments are conducted underground.
    “Without mitigation, radiation will limit the coherence time of superconducting qubits to a few milliseconds, which is insufficient for practical quantum computing,” said VanDevender.
    The researchers emphasize that factors other than radiation exposure are bigger impediments to qubit stability for the moment. Things like microscopic defects or impurities in the materials used to construct qubits are thought to be primarily responsible for the current performance limit of about one-tenth of a millisecond. But once those limitations are overcome, radiation begins to assert itself as a limit and will eventually become a problem without adequate natural radiation shielding strategies, the researchers said.
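    Because independent decoherence channels add as rates, the arithmetic behind that statement is straightforward; a minimal sketch using the round numbers quoted above:
```python
def combined_coherence_time(*channel_times_s):
    """Independent decoherence channels add as rates: 1/T = sum(1/T_i)."""
    return 1.0 / sum(1.0 / t for t in channel_times_s)

t_radiation = 4e-3    # ~4 ms ceiling from environmental radiation

# Today: defects and impurities limit qubits to ~0.1 ms, so the
# radiation channel barely registers in the total.
print(combined_coherence_time(1e-4, t_radiation))   # ~9.8e-05 s

# Once other mechanisms improve to ~10 ms, radiation dominates.
print(combined_coherence_time(1e-2, t_radiation))   # ~2.9e-03 s
```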
    Findings affect global search for dark matter
    In addition to helping explain a source of qubit instability, the research findings may also have implications for the global search for dark matter, which is thought to make up roughly 85% of the matter in the universe but has so far escaped detection with existing instruments. One approach to detecting it relies on superconducting detectors of similar design to qubits. Dark matter detectors also need to be shielded from external sources of radiation, because radiation can trigger false readings that obscure the dark matter signals they seek.
    “Improving our understanding of this process may lead to improved designs for these superconducting sensors and lead to more sensitive dark matter searches,” said Ben Loer, a PNNL research physicist who is working both in dark matter detection and radiation effects on superconducting qubits. “We may also be able to use our experience with these particle physics sensors to improve future superconducting qubit designs.”

  • New study warns: We have underestimated the pace at which the Arctic is melting

    Temperatures in the Arctic Ocean between Canada, Russia and Europe are rising faster than researchers’ climate models have been able to predict.
    Over the past 40 years, temperatures have risen by one degree every decade, and even more so over the Barents Sea and around Norway’s Svalbard archipelago, where they have increased by 1.5 degrees per decade throughout the period.
    This is the conclusion of a new study published in Nature Climate Change.
    “Our analyses of Arctic Ocean conditions demonstrate that we have been clearly underestimating the rate of temperature increases in the atmosphere nearest to the sea level, which has ultimately caused sea ice to disappear faster than we had anticipated,” explains Jens Hesselbjerg Christensen, a professor at the University of Copenhagen’s Niels Bohr Institute (NBI) and one of the study’s researchers.
    Together with his NBI colleagues and researchers from the Universities of Bergen and Oslo, the Danish Meteorological Institute and the Australian National University, he compared current temperature changes in the Arctic with climate fluctuations that we know from, for example, Greenland during the ice age between 120,000 and 11,000 years ago.
    “The abrupt rise in temperature now being experienced in the Arctic has only been observed during the last ice age. During that time, analyses of ice cores revealed that temperatures over the Greenland Ice Sheet increased several times, by 10 to 12 degrees, over a 40- to 100-year period,” explains Jens Hesselbjerg Christensen.
    He emphasizes that the significance of the steep rise in temperature is yet to be fully appreciated, and that an increased focus on the Arctic and on reducing global warming more generally are musts.
    Climate models ought to take abrupt changes into account
    Until now, climate models predicted that Arctic temperatures would increase slowly and in a stable manner. However, the researchers’ analysis demonstrates that these changes are moving along at a much faster pace than expected.
    “We have looked at the climate models analysed and assessed by the UN Climate Panel. Only those models based on the worst-case scenario, with the highest carbon dioxide emissions, come close to what our temperature measurements show over the past 40 years, from 1979 to today,” says Jens Hesselbjerg Christensen.
    In the future, there ought to be more of a focus on being able to simulate the impact of abrupt climate change on the Arctic. Doing so will allow us to create better models that can accurately predict temperature increases:
    “Changes are occurring so rapidly during the summer months that sea ice is likely to disappear faster than most climate models have ever predicted. We must continue to closely monitor temperature changes and incorporate the right climate processes into these models,” says Jens Hesselbjerg Christensen. He concludes:
    “Thus, successfully implementing the necessary reductions in greenhouse gas emissions to meet the Paris Agreement is essential in order to ensure a sea-ice packed Arctic year-round.”

    Story Source:
    Materials provided by University of Copenhagen. Note: Content may be edited for style and length.

  • NBA playoff format is optimizing competitive balance by eliminating travel

    In addition to helping protect players from COVID-19, the NBA “bubble” in Orlando may be a competitive equalizer by eliminating team travel. Researchers analyzing the results of nearly 500 NBA playoff games over six seasons found that a team’s direction of travel and the number of time zones crossed were associated with its predicted win probability and actual game performance.
    Preliminary results of the study suggest that the 2020 NBA playoffs, which begin Aug. 17, will eliminate any advantages or disadvantages related to long-distance travel. In this year’s unique playoff format, implemented due to the COVID-19 pandemic, all 16 teams will stay in Orlando, Florida, and compete at the ESPN Wide World of Sports Complex in Walt Disney World.
    The study found that scoring was significantly higher following eastward travel. Although there were no differences in actual game outcomes based on overall direction of travel, there were differences when considering both the direction and magnitude of travel. Teams that traveled east with three-hour time zone changes had higher predicted probabilities of winning than teams that traveled west or played in the same time zone. In contrast, teams that traveled west across three time zones had lower predicted win probabilities than teams that traveled east or played in the same time zone.
    “During this initial study, it was interesting to find that team scoring improved during general eastward travel compared to westward travel and travel in the same zone, but game outcomes were unaffected by direction of travel during the playoffs,” said lead author Sean Pradhan, assistant professor of sports management and business analytics in the School of Business Administration at Menlo College in Atherton, California. “However, when considering the magnitude of travel across different time zones, we found that teams had predicted probabilities of winning that were lower after traveling three time zones westward, and tended to actually lose more games when traveling two time zones westward compared to most other types of travel.”
    Circadian rhythms are endogenous, near-24-hour biological rhythms that exist in all living organisms, and these daily rhythms have peaks and troughs in both alertness and sleepiness that can impact individuals in high-performance professions. Therefore, an athlete has a greater opportunity for optimal performance when the timing of an activity is synchronized with the body’s circadian clock.
    Researchers from Menlo College and other collaborators reviewed data from 499 NBA playoff games from the 2013-2014 through 2018-2019 seasons. They looked at the impact of direction of travel and time zones traveled on actual game outcomes, team quality, predicted win probability, and team scoring for visiting teams.
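    As a generic illustration of that kind of analysis (toy rows and a simple model, not the authors' dataset or their actual specification), a logistic regression of game outcome on travel direction and time zones crossed might be set up like this:
```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Toy rows for illustration only -- NOT the study's data. Each row is one
# visiting team's playoff game.
games = pd.DataFrame({
    "direction":  ["east", "east", "west", "west", "same", "same",
                   "east", "west", "same", "west", "east", "same"],
    "tz_crossed": [1, 3, 2, 3, 0, 0, 2, 1, 0, 3, 3, 0],
    "won":        [1, 0, 0, 1, 1, 0, 1, 0, 1, 0, 1, 0],
})

# One-hot encode travel direction and model win probability as a function
# of direction and number of time zones crossed.
X = pd.get_dummies(games[["direction", "tz_crossed"]], columns=["direction"])
model = LogisticRegression().fit(X, games["won"])

print(dict(zip(X.columns, model.coef_[0].round(2))))
print("predicted win probability for row 0:", model.predict_proba(X.iloc[[0]])[0, 1])
```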
    “A great deal of prior work has examined the effects of travel and circadian advantages on team performance during the regular season of various professional sports leagues,” said Pradhan. “The current study extends such findings of previous research by examining team performance in the NBA playoffs, which is obviously an extremely crucial time for teams competing.”

    Story Source:
    Materials provided by American Academy of Sleep Medicine. Note: Content may be edited for style and length.

  • Revised code could help improve efficiency of fusion experiments

    An international team of researchers led by the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) has upgraded a key computer code for calculating forces acting on magnetically confined plasma in fusion energy experiments. The upgrade will be part of a suite of computational tools that will allow scientists to further improve the design of breakfast-cruller-shaped facilities known as stellarators. Together, the three codes in the suite could help scientists bring efficient fusion reactors closer to reality.
    The revised software lets researchers more easily determine the boundary of plasma in stellarators. When used in concert with two other codes, the code could help find a stellarator configuration that improves the performance of the design. The two complementary codes determine the optimal location for the plasma in a stellarator vacuum chamber to maximize the efficiency of the fusion reactions, and determine the shape that the external electromagnets must have to hold the plasma in the proper position.
    The revised software, called the “free-boundary stepped-pressure equilibrium code (SPEC),” is one of a set of tools scientists can use to tweak the performance of plasma to more easily create fusion energy. “We want to optimize both the plasma position and the magnetic coils to balance the force that makes the plasma expand while holding it in place,” said Stuart Hudson, physicist, deputy head of the Theory Department at PPPL and lead author of the paper reporting the results in Plasma Physics and Controlled Fusion.
    “That way we can create a stable plasma whose particles are more likely to fuse. The updated SPEC code enables us to know where the plasma will be for a given set of magnetic coils.”
    Fusion combines light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei — and in the process generates massive amounts of energy in the sun and stars. Scientists are seeking to replicate fusion in devices on Earth for a virtually inexhaustible supply of safe and clean power to generate electricity.
    Plasma stability is crucial for fusion. If plasma bounces around inside a stellarator, it can escape, cool, and tamp down the fusion reactions, in effect quenching the fusion fire. An earlier version of the code, also developed by Hudson, could only calculate how forces were affecting a plasma if the researchers already knew the plasma’s location. Researchers, however, typically don’t have that information. “That’s one of the problems with plasmas,” Hudson said. “They move all over the place.”
    The new version of the SPEC code helps solve the problem by allowing researchers to calculate the plasma’s boundary without knowing its position beforehand. Used in coordination with a coil-design code called FOCUS and an optimization code called STELLOPT — both of which were also developed at PPPL — SPEC lets physicists simultaneously ensure that the plasma will have the best fusion performance and the magnets will not be too complicated to build. “There’s no point optimizing the shape of the plasma and then later finding out that the magnets would be incredibly difficult to construct,” Hudson said.
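    Schematically, the three codes close a loop: propose a coil set, solve for the free-boundary plasma it produces, score the configuration for both physics performance and coil buildability, and adjust. The sketch below is a toy illustration of that workflow; the functions are invented stand-ins, not the real SPEC, STELLOPT, or FOCUS interfaces.
```python
import random

def solve_free_boundary_equilibrium(coil_params):
    """Toy stand-in for SPEC's role: given coil parameters, return a plasma
    boundary descriptor without requiring the boundary to be known first."""
    return [0.9 * c for c in coil_params]

def plasma_objective(boundary):
    """Toy stand-in for STELLOPT's role: lower means better confinement."""
    return sum((b - 1.0) ** 2 for b in boundary)

def coil_objective(coil_params):
    """Toy stand-in for FOCUS's role: penalize hard-to-build coil shapes."""
    return 0.1 * sum(c ** 2 for c in coil_params)

def optimize(coil_params, iterations=2000, step=0.05, seed=0):
    """Random-search loop: solve the equilibrium, score plasma plus coils,
    and keep any improvement. Real workflows use far better optimizers."""
    rng = random.Random(seed)

    def total_cost(c):
        return plasma_objective(solve_free_boundary_equilibrium(c)) + coil_objective(c)

    best = total_cost(coil_params)
    for _ in range(iterations):
        trial = [c + rng.uniform(-step, step) for c in coil_params]
        if total_cost(trial) < best:
            coil_params, best = trial, total_cost(trial)
    return coil_params, best

print(optimize([0.0, 0.0, 0.0]))
```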
    One challenge that Hudson and colleagues faced was verifying that each step of the code upgrade was done correctly. Their slow-and-steady approach was crucial to making sure that the code makes accurate calculations. “Let’s say you are designing a component that will go on a rocket to the moon,” Hudson said. “It’s very important that that part works. So you test and test and test.”
    Updating any computer code calls for a number of interlocking steps:
    First, scientists must translate a set of mathematical equations describing the plasma into a programming language that a computer can understand;
    Next, scientists must determine the mathematical steps needed to solve the equations;
    Finally, the scientists must verify that the code produces correct results, either by comparing the results with those produced by a code that has already been verified or using the code to solve simple equations whose answers are easy to check.
    Hudson and colleagues performed the calculations with widely different methods. They used pencil and paper to determine the equations and solution steps, and powerful PPPL computers to verify the results. “We demonstrated that the code works,” Hudson said. “Now it can be used to study current experiments and design new ones.”
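    The same verify-against-a-known-answer idea can be shown in miniature: exercise a numerical routine on a problem whose exact solution is known before trusting it on real cases. This is a generic example, not part of the SPEC test suite.
```python
import math

def trapezoid(f, a, b, n=10_000):
    """Composite trapezoid rule for integrating f over [a, b]."""
    h = (b - a) / n
    interior = sum(f(a + i * h) for i in range(1, n))
    return h * (0.5 * (f(a) + f(b)) + interior)

# Verification: the integral of sin(x) over [0, pi] is exactly 2.
result = trapezoid(math.sin, 0.0, math.pi)
assert abs(result - 2.0) < 1e-6, "routine fails its known-answer test"
print(f"computed {result:.8f}, exact answer 2")
```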
    Collaborators on the paper include researchers at the Max Planck Institute for Plasma Physics, the Australian National University, and the Swiss École Polytechnique Fédérale de Lausanne. The research was supported by the DOE’s Office of Science (Fusion Energy Sciences), the Euratom research and training program, the Australian Research Council, and the Simons Foundation.

    Story Source:
    Materials provided by DOE/Princeton Plasma Physics Laboratory. Original written by Raphael Rosen. Note: Content may be edited for style and length.

  • Virtual imaging trials optimize CT, radiography for COVID-19

    An open-access article in ARRS’ American Journal of Roentgenology (AJR) established a foundation for the use of virtual imaging trials in effective assessment and optimization of CT and radiography acquisitions and analysis tools to help manage the coronavirus disease (COVID-19) pandemic.
    Virtual imaging trials have two main components: representative models of targeted subjects and realistic models of imaging scanners. The authors of this AJR article developed the first computational models of patients with COVID-19 and showed, as proof of principle, how these models can be combined with imaging simulators for COVID-19 imaging studies.
    “For the body habitus of the models,” lead author Ehsan Abadi explained, “we used the 4D extended cardiac-torso (XCAT) model that was developed at Duke University.”
    Abadi and his Duke colleagues then segmented the morphologic features of COVID-19 abnormalities from 20 CT images of patients with multidiagnostic confirmation of SARS-CoV-2 infection and incorporated them into XCAT models.
    “Within a given disease area, the texture and material of the lung parenchyma in the XCAT were modified to match the properties observed in the clinical images,” Abadi et al. continued.
    Using a specific CT scanner (Definition Flash, Siemens Healthineers) and validated radiography simulator (DukeSim) to help illustrate utility, the team virtually imaged three developed COVID-19 computational phantoms.
    “Subjectively,” the authors concluded, “the simulated abnormalities were realistic in terms of shape and texture,” adding their preliminary results showed that the contrast-to-noise ratios in the abnormal regions were 1.6, 3.0, and 3.6 for 5-, 25-, and 50-mAs images, respectively.
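    For context, contrast-to-noise ratio is commonly computed from a lesion region of interest and a nearby background region; here is a minimal sketch using one common definition and made-up pixel values, not the study's images.
```python
import numpy as np

def contrast_to_noise_ratio(lesion_roi, background_roi):
    """One common CNR definition: |mean(lesion) - mean(background)| / std(background)."""
    lesion_roi = np.asarray(lesion_roi, dtype=float)
    background_roi = np.asarray(background_roi, dtype=float)
    return abs(lesion_roi.mean() - background_roi.mean()) / background_roi.std()

# Made-up intensities standing in for abnormal and normal lung regions.
rng = np.random.default_rng(0)
background = rng.normal(loc=-800.0, scale=25.0, size=500)
lesion = rng.normal(loc=-720.0, scale=25.0, size=500)

print(f"CNR = {contrast_to_noise_ratio(lesion, background):.1f}")   # roughly 3
```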

    Story Source:
    Materials provided by American Roentgen Ray Society. Note: Content may be edited for style and length.

  • Building mechanical memory boards using origami

    The ancient Japanese art of paper folding, known as origami, can be used to create mechanical, binary switches.
    In Applied Physics Letters, by AIP Publishing, researchers report the fabrication of such a paper device using a particular origami pattern known as the Kresling pattern. This device can act as a mechanical switch.
    By putting several of these together on a single platform, the investigators built a functioning mechanical memory board.
    Origami structures can be either rigid or nonrigid. For the first type, only the creases between panels of paper can deform, but the panels stay fixed. In nonrigid origami, however, the panels themselves can deform.
    The Kresling pattern is an example of nonrigid origami. Folding a piece of paper using this pattern generates a bellows-like structure that can flip between one orientation and another. The bellows act as a type of spring and can be controlled by vibrating a platform that holds the bellows. This creates a switch, which the investigators refer to as a Kresling-inspired mechanical switch, or KIMS.
    The researchers found that oscillating a platform holding the KIMS up and down at a certain speed will cause it to flip, or switch, between its two stable states. They used an electrodynamic shaker to provide controlled movements of the base and monitored the upper surface of the KIMS using a laser. In this way, they were able to map out and analyze the basic physics that underlies the switching behavior.
    “We used the Kresling origami pattern to also develop a cluster of mechanical binary switches,” author Ravindra Masana said. “These can be forced to transition between two different static states using a single controlled input in the form of a harmonic excitation applied at the base of the switch.”
    The group first considered a 2-bit memory board created by placing two KIMS units on a single platform. Because each KIMS bit has two stable states, four distinct states identified as S00, S01, S10 and S11 can be obtained. Oscillations of the platform will cause switching between these four stable states. This proof of concept with just two bits could be extended to multiple KIMS units, creating a type of mechanical memory.
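    As a conceptual illustration of the memory board's logic (a logical model only, not a simulation of the Kresling mechanics, and the switching bands below are invented numbers), each bit can be treated as a bistable element that flips only when the shared base excitation falls inside its own switching band:
```python
class KreslingBit:
    """Bistable element: two stable orientations, toggled by excitation
    that falls within this bit's (hypothetical) switching band."""
    def __init__(self, low_hz, high_hz, state=0):
        self.band = (low_hz, high_hz)
        self.state = state

    def excite(self, frequency_hz):
        if self.band[0] <= frequency_hz <= self.band[1]:
            self.state ^= 1   # flip to the other stable state

class MemoryBoard:
    """Several bits on one platform, all driven by the same base excitation."""
    def __init__(self, bits):
        self.bits = bits

    def excite(self, frequency_hz):
        for bit in self.bits:
            bit.excite(frequency_hz)

    def read(self):
        return "S" + "".join(str(bit.state) for bit in self.bits)

board = MemoryBoard([KreslingBit(10, 14), KreslingBit(18, 22)])
print(board.read())   # S00
board.excite(12)      # falls only in the first bit's band
print(board.read())   # S10
board.excite(20)      # falls only in the second bit's band
print(board.read())   # S11
```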
    “Such switches can be miniaturized,” said Mohammed Daqaq, one of the authors and the director of the Laboratory of Applied Nonlinear Dynamics at NYU Abu Dhabi. “Instead of using a bulky electrodynamic shaker for actuation, the memory board can then be actuated using scalable piezoelectric and graphene actuators.”
    Miniaturized origami memory boards should have wide applicability and hold great promise for future device development.

    Story Source:
    Materials provided by American Institute of Physics. Note: Content may be edited for style and length.