More stories

  •

    Standard digital camera and AI to monitor soil moisture for affordable smart irrigation

    Researchers at UniSA have developed a cost-effective new technique to monitor soil moisture using a standard digital camera and machine learning technology.
    The United Nations predicts that by 2050 many areas of the planet may not have enough fresh water to meet the demands of agriculture if we continue our current patterns of use.
    One solution to this global dilemma is the development of more efficient irrigation, central to which is precision monitoring of soil moisture, allowing sensors to guide ‘smart’ irrigation systems to ensure water is applied at the optimum time and rate.
    Current methods for sensing soil moisture are problematic — buried sensors are susceptible to salts in the substrate and require specialised hardware for connections, while thermal imaging cameras are expensive and can be compromised by climatic conditions such as sunlight intensity, fog, and clouds.
    Researchers from The University of South Australia and Baghdad’s Middle Technical University have developed a cost-effective alternative that may make precision soil monitoring simple and affordable in almost any circumstance.
    A team including UniSA engineers Dr Ali Al-Naji and Professor Javaan Chahl has successfully tested a system that uses a standard RGB digital camera to accurately monitor soil moisture under a wide range of conditions.


    “The system we trialled is simple, robust and affordable, making it promising technology to support precision agriculture,” Dr Al-Naji says.
    “It is based on a standard video camera which analyses the differences in soil colour to determine moisture content. We tested it at different distances, times and illumination levels, and the system was very accurate.”
    The camera was connected to an artificial neural network (ANN), a form of machine learning software that the researchers trained to recognise different soil moisture levels under different sky conditions.
    Using this ANN, the monitoring system could potentially be trained to recognise the specific soil conditions of any location, allowing it to be customised for each user and updated for changing climatic circumstances, ensuring maximum accuracy.
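    The mapping from camera colour to moisture class can be sketched with a small neural network. Everything below (the RGB means, the class labels and the network size) is a hypothetical illustration, not the published model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data (illustrative only): wetter soil reflects less
# light, so its mean RGB colour is darker.
def sample(n, mean_rgb, label):
    return rng.normal(mean_rgb, 8.0, size=(n, 3)), np.full(n, label)

X_dry, y_dry = sample(200, [150.0, 120.0, 90.0], 0)     # dry: light brown
X_moist, y_moist = sample(200, [110.0, 85.0, 60.0], 1)  # moist: mid brown
X_wet, y_wet = sample(200, [70.0, 55.0, 40.0], 2)       # wet: dark brown
X = np.vstack([X_dry, X_moist, X_wet]) / 255.0
y = np.concatenate([y_dry, y_moist, y_wet])
targets = np.eye(3)[y]

# One hidden layer trained with plain gradient descent on
# softmax cross-entropy.
W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 3)); b2 = np.zeros(3)
lr = 1.0
for _ in range(1000):
    h = np.tanh(X @ W1 + b1)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    g = (p - targets) / len(X)        # output-layer gradient
    gh = (g @ W2.T) * (1.0 - h**2)    # back-propagate through tanh
    W2 -= lr * h.T @ g;  b2 -= lr * g.sum(axis=0)
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(axis=0)

def predict(rgb):
    """Return 0 (dry), 1 (moist) or 2 (wet) for a mean RGB colour."""
    h = np.tanh(np.asarray(rgb) / 255.0 @ W1 + b1)
    return int(np.argmax(h @ W2 + b2))
```

    A deployed system would extract colour features from calibrated field images and retrain per site, as the article suggests.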
    “Once the network has been trained it should be possible to achieve controlled irrigation by maintaining the appearance of the soil at the desired state,” Prof Chahl says.
    “Now that we know the monitoring method is accurate, we are planning to design a cost-effective smart-irrigation system based on our algorithm using a microcontroller, USB camera and water pump that can work with different types of soils.
    “This system holds promise as a tool for improved irrigation technologies in agriculture in terms of cost, availability and accuracy under changing climatic conditions.”

    Story Source:
    Materials provided by University of South Australia. Note: Content may be edited for style and length.

  •

    Calls to poison centers about high-powered magnets increased by 444% after ban lifted

    High-powered magnets are small, shiny magnets made from powerful rare earth metals. Since they started showing up in children’s toys in the early 2000s and then later in desk sets in 2009, high-powered magnets have caused thousands of injuries and are considered to be among the most dangerous ingestion hazards in children.
    When more than one is swallowed, these high-powered magnets attract each other across tissue, cutting off blood supply to the bowel and causing obstructions, tissue necrosis, sepsis and even death. The U.S. Consumer Product Safety Commission (CPSC) found them dangerous enough that in 2012 it halted the sale of high-powered magnet sets and instituted a recall, followed by a federal rule that effectively eliminated the sale of these products. This rule was overturned by the U.S. Court of Appeals in December 2016.
    A recent study led by researchers at the Center for Injury Research and Policy, Emergency Medicine, and the Central Ohio Poison Center at Nationwide Children’s Hospital along with the Children’s Hospital at Montefiore (CHAM) analyzed calls to U.S. poison centers for magnet exposures in children age 19 years and younger from 2008 through October 2019 to determine the impact of the CPSC rule and the subsequent lift of the ban.
    The study, recently published in Journal of Pediatrics, found that the average number of cases per year decreased 33% from 2012 to 2017 after high-powered magnet sets were removed from the market. When the ban was lifted and high-powered magnet sets re-entered the market, the average number of cases per year increased 444%. There was also a 355% increase in the number of cases that were serious enough to require treatment in a hospital. Cases from 2018 and 2019 increased across all age groups and accounted for 39% of magnet cases since 2008.
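    For readers checking the arithmetic, the percentage changes reported here are plain relative changes in the average annual case counts. The numbers in the sketch below are illustrative, not the study's:

```python
def pct_change(before, after):
    """Relative change from `before` to `after`, in percent."""
    return (after - before) / before * 100.0

# A 444% increase means the new average is 5.44 times the old one
# (illustrative numbers, not the study's actual counts):
print(pct_change(100.0, 544.0))
```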
    “Regulations on these products were effective, and the dramatic increase in the number of high-powered magnet related injuries since the ban was lifted — even compared to pre-ban numbers — is alarming,” said Leah Middelberg, MD, lead author of the study and emergency medicine physician at Nationwide Children’s. “Parents don’t always know if their child swallowed something or what they swallowed — they just know their child is uncomfortable — so when children are brought in, an exam and sometimes x-rays are needed to determine what’s happening. Because damage caused by magnets can be serious, it’s so important to keep these kinds of magnets out of reach of children, and ideally out of the home.”
    The study found a total of 5,738 magnet exposures during the nearly 12-year study period. Most calls were for children who were male (55%), younger than six years (62%), with an unintentional injury (84%). Approximately one-half (48.4%) of patients were treated at a hospital or other healthcare facility while 48.7% were managed at a non-healthcare site such as a home, workplace, or school. Children in older age groups were more likely than younger children to be admitted to the hospital.
    “While many cases occur among young children, parents need to be aware that high-powered magnets are a risk for teenagers as well,” said Bryan Rudolph, MD, MPH, co-senior author of this study and gastroenterologist at CHAM. “Serious injuries can happen when teens use these products to mimic tongue or lip piercings. If there are children or teens who live in or frequently visit your home, don’t buy these products. If you have high-powered magnets in your home, throw them away. The risk of serious injury is too great.”
    “Significant increases in magnet injuries correspond to time periods in which high-powered magnet sets were sold, including a 444% increase since 2018,” said Middelberg. “These data reflect the urgent need to protect children by preventive measures and government action,” Rudolph emphasized. Both Middelberg and Rudolph support the federal legislation, “Magnet Injury Prevention Act,” which would limit the strength and/or size of magnets sold as part of a set, as well as reinstatement of a CPSC federal safety standard that would effectively restrict the sale of these magnet products in the U.S.

    Story Source:
    Materials provided by Nationwide Children’s Hospital. Note: Content may be edited for style and length.

  •

    Engineers combine AI and wearable cameras in self-walking robotic exoskeletons

    Robotics researchers are developing exoskeletons and prosthetic legs capable of thinking and making control decisions on their own using sophisticated artificial intelligence (AI) technology.
    The system combines computer vision and deep-learning AI to mimic how able-bodied people walk by seeing their surroundings and adjusting their movements.
    “We’re giving robotic exoskeletons vision so they can control themselves,” said Brokoslaw Laschowski, a PhD candidate in systems design engineering who leads a University of Waterloo research project called ExoNet.
    Exoskeleton legs operated by motors already exist, but users must manually control them via smartphone applications or joysticks.
    “That can be inconvenient and cognitively demanding,” said Laschowski, also a student member of the Waterloo Artificial Intelligence Institute (Waterloo.ai). “Every time you want to perform a new locomotor activity, you have to stop, take out your smartphone and select the desired mode.”
    To address that limitation, the researchers fitted exoskeleton users with wearable cameras and are now optimizing AI computer software to process the video feed to accurately recognize stairs, doors and other features of the surrounding environment.
    The next phase of the ExoNet research project will involve sending instructions to motors so that robotic exoskeletons can climb stairs, avoid obstacles or take other appropriate actions based on analysis of the user’s current movement and the upcoming terrain.
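    The control flow described above, classify the upcoming terrain and then choose a locomotion mode automatically instead of via a smartphone, can be sketched as a classifier-plus-policy loop. The class names, confidence threshold and policy table below are all hypothetical, not ExoNet's actual interface:

```python
from enum import Enum, auto

class Terrain(Enum):
    LEVEL_GROUND = auto()
    STAIRS = auto()
    DOOR = auto()
    OBSTACLE = auto()

class Mode(Enum):
    WALK = auto()
    STAIR_CLIMB = auto()
    STOP_AND_OPEN = auto()
    STEP_OVER = auto()

# Hypothetical policy: map the vision system's terrain class to an
# exoskeleton locomotion mode, replacing manual smartphone selection.
POLICY = {
    Terrain.LEVEL_GROUND: Mode.WALK,
    Terrain.STAIRS: Mode.STAIR_CLIMB,
    Terrain.DOOR: Mode.STOP_AND_OPEN,
    Terrain.OBSTACLE: Mode.STEP_OVER,
}

def select_mode(terrain: Terrain, confidence: float, current: Mode) -> Mode:
    # Only switch modes on a confident prediction; otherwise keep the
    # current mode (a conservative choice for user safety).
    if confidence < 0.9:
        return current
    return POLICY[terrain]
```

    In a real system the terrain class and confidence would come from the deep-learning model processing the wearable camera's video feed.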
    “Our control approach wouldn’t necessarily require human thought,” said Laschowski, who is supervised by engineering professor John McPhee, the Canada Research Chair in Biomechatronic System Dynamics. “Similar to autonomous cars that drive themselves, we’re designing autonomous exoskeletons and prosthetic legs that walk for themselves.”
    The researchers are also working to improve the energy efficiency of motors for robotic exoskeletons and prostheses by using human motion to self-charge the batteries.

    Story Source:
    Materials provided by University of Waterloo. Note: Content may be edited for style and length.

  •

    Computing clean water

    Water is perhaps Earth’s most critical natural resource. Given increasing demand and increasingly stretched water resources, scientists are pursuing more innovative ways to use and reuse existing water, as well as to design new materials to improve water purification methods. Synthetically created semi-permeable polymer membranes used for contaminant solute removal can provide a level of advanced treatment and improve the energy efficiency of treating water; however, existing knowledge gaps are limiting transformative advances in membrane technology. One basic problem is learning how the affinity, or the attraction, between solutes and membrane surfaces impacts many aspects of the water purification process.
    “Fouling — where solutes stick to and gunk up membranes — significantly reduces performance and is a major obstacle in designing membranes to treat produced water,” said M. Scott Shell, a chemical engineering professor at UC Santa Barbara, who conducts computational simulations of soft materials and biomaterials. “If we can fundamentally understand how solute stickiness is affected by the chemical composition of membrane surfaces, including possible patterning of functional groups on these surfaces, then we can begin to design next-generation, fouling-resistant membranes to repel a wide range of solute types.”
    Now, in a paper published in the Proceedings of the National Academy of Sciences (PNAS), Shell and lead author Jacob Monroe, a recent Ph.D. graduate of the department and a former member of Shell’s research group, explain the relevance of macroscopic characterizations of solute-to-surface affinity.
    “Solute-surface interactions in water determine the behavior of a huge range of physical phenomena and technologies, but are particularly important in water separation and purification, where often many distinct types of solutes need to be removed or captured,” said Monroe, now a postdoctoral researcher at the National Institute of Standards and Technology (NIST). “This work tackles the grand challenge of understanding how to design next-generation membranes that can handle huge yearly volumes of highly contaminated water sources, like those produced in oilfield operations, where the concentration of solutes is high and their chemistries quite diverse.”
    Solutes are frequently characterized as spanning a range from hydrophilic, which can be thought of as water-liking and dissolving easily in water, to hydrophobic, or water-disliking and preferring to separate from water, like oil. Surfaces span the same range; for example, water beads up on hydrophobic surfaces and spreads out on hydrophilic surfaces. Hydrophilic solutes like to stick to hydrophilic surfaces, and hydrophobic solutes stick to hydrophobic surfaces. Here, the researchers corroborated the expectation that “like sticks to like,” but also discovered, surprisingly, that the complete picture is more complex.
    “Among the wide range of chemistries that we considered, we found that hydrophilic solutes also like hydrophobic surfaces, and that hydrophobic solutes also like hydrophilic surfaces, though these attractions are weaker than those of like to like,” explained Monroe, referencing the eight solutes the group tested, ranging from ammonia and boric acid, to isopropanol and methane. The group selected small-molecule solutes typically found in produced waters to provide a fundamental perspective on solute-surface affinity.


    The computational research group developed an algorithm to repattern surfaces by rearranging surface chemical groups in order to minimize or maximize the affinity of a given solute to the surface, or alternatively, to maximize the surface affinity of one solute relative to that of another. The approach relied on a genetic algorithm that “evolved” surface patterns in a way similar to natural selection, optimizing them toward a particular function goal.
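    The evolutionary loop described here, selection, crossover and mutation over candidate surface patterns, can be illustrated with a toy genetic algorithm. The scoring function below is a cheap stand-in for the group's molecular simulations, chosen only so the sketch runs:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 16  # surface sites: 0 = hydrophobic group, 1 = hydrophilic group

def affinity_score(pattern):
    # Toy stand-in for the simulation-based objective: reward adjacent
    # like groups, so the optimum is a uniform surface.
    return int(np.sum(pattern[:-1] == pattern[1:]))

def evolve(generations=200, pop_size=40, mut_rate=0.05):
    pop = rng.integers(0, 2, (pop_size, N))
    for _ in range(generations):
        scores = np.array([affinity_score(p) for p in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]  # keep fittest half
        a = parents[rng.integers(len(parents), size=pop_size)]
        b = parents[rng.integers(len(parents), size=pop_size)]
        cuts = rng.integers(1, N, size=pop_size)            # one-point crossover
        pop = np.array([np.concatenate([x[:c], y[c:]])
                        for x, y, c in zip(a, b, cuts)])
        flips = rng.random(pop.shape) < mut_rate            # point mutation
        pop = np.where(flips, 1 - pop, pop)
    best = max(pop, key=affinity_score)
    return best, affinity_score(best)
```

    Swapping the scoring function for a simulation of solute-surface affinity (or the affinity of one solute relative to another) recovers the kind of search the paper describes.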
    Through simulations, the team discovered that surface affinity was poorly correlated to conventional methods of solute hydrophobicity, such as how soluble a solute is in water. Instead, they found a stronger connection between surface affinity and the way that water molecules near a surface or near a solute change their structures in response. In some cases, these neighboring waters were forced to adopt structures that were unfavorable; by moving closer to hydrophobic surfaces, solutes could then reduce the number of such unfavorable water molecules, providing an overall driving force for affinity.
    “The missing ingredient was understanding how the water molecules near a surface are structured and move around it,” said Monroe. “In particular, water structural fluctuations are enhanced near hydrophobic surfaces, compared to bulk water, or the water far away from the surface. We found that fluctuations drove the stickiness of every small solute type that we tested.”
    The finding is significant because it shows that in designing new surfaces, researchers should focus on the response of water molecules around them and avoid being guided by conventional hydrophobicity metrics.
    Based on their findings, Monroe and Shell say that surfaces composed of different types of molecular chemistries may be the key to achieving multiple performance goals, such as preventing an assortment of solutes from fouling a membrane.
    “Surfaces with multiple types of chemical groups offer great potential. We showed that not only the presence of different surface groups, but their arrangement or pattern, influence solute-surface affinity,” Monroe said. “Just by rearranging the spatial pattern, it becomes possible to significantly increase or decrease the surface affinity of a given solute, without changing how many surface groups are present.”
    According to the team, their findings show that computational methods can contribute in significant ways to next-generation membrane systems for sustainable water treatment.
    “This work provided detailed insight into the molecular-scale interactions that control solute-surface affinity,” said Shell, the John E. Myers Founder’s Chair in Chemical Engineering. “Moreover, it shows that surface patterning offers a powerful design strategy in engineering membranes that are resistant to fouling by a variety of contaminants and that can precisely control how each solute type is separated out. As a result, it offers molecular design rules and targets for next-generation membrane systems capable of purifying highly contaminated waters in an energy-efficient manner.”
    Most of the surfaces examined were model systems, simplified to facilitate analysis and understanding. The researchers say that the natural next step will be to examine increasingly complex and realistic surfaces that more closely mimic actual membranes used in water treatment. Another important step to bring the modeling closer to membrane design will be to move beyond understanding merely how sticky a membrane is for a solute and toward computing the rates at which solutes move through membranes.

  •

    A computational guide to lead cells down desired differentiation paths

    There is a great need to generate various types of cells for use in new therapies to replace tissues that are lost due to disease or injuries, or for studies outside the human body to improve our understanding of how organs and tissues function in health and disease. Many of these efforts start with human induced pluripotent stem cells (iPSCs) that, in theory, have the capacity to differentiate into virtually any cell type in the right culture conditions. The 2012 Nobel Prize awarded to Shinya Yamanaka recognized his discovery of a strategy that can reprogram adult cells to become iPSCs by providing them with a defined set of gene-regulatory transcription factors (TFs). However, progressing from there to efficiently generating a wide range of cell types with tissue-specific differentiated functions for biomedical applications has remained a challenge.
    While the expression of cell type-specific TFs in iPSCs is the most often used cellular conversion technology, the efficiencies of guiding iPSCs through different “lineage stages” to the fully functional differentiated state of, for example, a specific heart, brain, or immune cell currently are low, mainly because the most effective TF combinations cannot be easily pinpointed. TFs that instruct cells to pass through a specific cell differentiation process bind to regulatory regions of genes to control their expression in the genome. However, multiple TFs must function in the context of larger gene regulatory networks (GRNs) to drive the progression of cells through their lineages until the final differentiated state is reached.
    Now, a collaborative effort led by George Church, Ph.D. at Harvard’s Wyss Institute for Biologically Inspired Engineering and Harvard Medical School (HMS), and Antonio del Sol, Ph.D., who leads Computational Biology groups at CIC bioGUNE, a member of the Basque Research and Technology Alliance, in Spain, and at the Luxembourg Centre for Systems Biomedicine (LCSB, University of Luxembourg), has developed a computer-guided design tool called IRENE, which significantly helps increase the efficiency of cell conversions by predicting highly effective combinations of cell type-specific TFs. By combining IRENE with a genomic integration system that allows robust expression of selected TFs in iPSCs, the team demonstrated their approach to generate higher numbers of natural killer cells used in immune therapies, and melanocytes used in skin grafts, than other methods. In a scientific first, the team generated breast mammary epithelial cells, whose availability would be highly desirable for the repopulation of surgically removed mammary tissue. The study is published in Nature Communications.
    “In our group, the study naturally built on the ‘TFome’ project, which assembled a comprehensive library containing 1,564 human TFs as a powerful resource for the identification of TF combinations with enhanced abilities to reprogram human iPSCs to different target cell types,” said Wyss Core Faculty member Church. “The efficacy of this computational algorithm will boost a number of our tissue engineering efforts at the Wyss Institute and HMS, and as an open resource can do the same for many researchers in this burgeoning field.” Church is the lead of the Wyss Institute’s Synthetic Biology platform, and Professor of Genetics at HMS and of Health Sciences and Technology at Harvard and MIT.
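    A quick calculation shows why a computational screen is needed: with 1,564 TFs in the TFome library, even the three-factor combinations alone number in the hundreds of millions, far beyond what could be tested experimentally.

```python
import math

# Number of distinct 3-TF combinations from a library of 1,564 factors.
n_triples = math.comb(1564, 3)
print(n_triples)  # 636,393,164
```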
    Tooling up
    Several computational tools have been developed to predict combinations of TFs for specific cell conversions, but almost exclusively these are based on the analysis of gene expression patterns in many cell types. Missing in these approaches was a view of the epigenetic landscape, the organization of the genome itself around genes and on the scale of entire chromosome sections, which goes far beyond the sequence of the naked genomic DNA.


    “The changing epigenetic landscape in differentiating cells predicts areas in the genome undergoing physical changes that are critical for key TFs to gain access to their target genes. Analyzing these changes can inform more accurately about GRNs and their participating TFs that drive specific cell conversions,” said co-first author Evan Appleton, Ph.D. Appleton is a Postdoctoral Fellow in Church’s group who joined forces with Sascha Jung, Ph.D., from del Sol’s group in the new study. “Our collaborators in Spain had developed a computational approach that integrated those epigenetic changes with changes in gene expression to produce critical TF combinations as an output, which we were in an ideal position to test.”
    The team used their computational “Integrative gene Regulatory Network model” (IRENE) approach to reconstruct the GRN controlling iPSCs, and then focused on three target cell types with clinical relevance to experimentally validate TF combinations prioritized by IRENE. To deliver TF combinations into iPSCs, they deployed a transposon-based genomic integration system that can integrate multiple copies of a gene encoding a TF into the genome, which allows all factors of a combination to be stably expressed. Transposons are DNA elements that can jump from one position of the genome to another, or in this case from an exogenously provided piece of DNA into the genome.
    “Our research team composed of scientists from the LCSB and CIC bioGUNE has a long-standing expertise in developing computational methods to facilitate cell conversion. IRENE is an additional resource in our toolbox and one for which experimental validation has demonstrated it substantially increased efficiency in most tested cases,” said corresponding author del Sol, who is Professor at LCSB and CIC bioGUNE. “Our fundamental research should ultimately benefit patients, and we are thrilled that IRENE could enhance the production of cell sources readily usable in therapeutic applications, such as cell transplantation and gene therapies.”
    Validating the computer-guided design tool in cells
    The researchers chose human mammary epithelial cells (HMECs) as a first cell type. Thus far, HMECs are obtained from one tissue environment, dissociated, and transplanted to a site where breast tissue has been resected. HMECs generated from patients’ cells, via an intermediate iPSC stage, could provide a means for less invasive and more effective breast tissue regeneration. One of the combinations generated by IRENE enabled the team to convert 14% of iPSCs into differentiated HMECs in iPSC-specific culture media, showing that the provided TFs were sufficient to drive the conversion without help from additional factors.
    The team then turned their attention to melanocytes, which can provide a source of cells in cellular grafts to replace damaged skin. This time they performed the cell conversion in melanocyte destination medium to show that the selected TFs work under culture conditions optimized for the desired cell type. Two out of four combinations were able to increase the efficiency of melanocyte conversion by 900% compared to iPSCs grown in destination medium without the TFs. Finally, the researchers compared combinations of TFs prioritized by IRENE to generate natural killer (NK) cells with a state-of-the-art differentiation method based on cell culture conditions alone. Immune NK cells have been found to improve the treatment of leukemia. The researchers’ approach outperformed the standard with five out of eight combinations increasing the differentiation of NK cells with critical markers by up to 250%.
    “This novel computational approach could greatly facilitate a range of cell and tissue engineering efforts at the Wyss Institute and many other sites around the world. This advance should greatly expand our toolbox as we strive to develop new approaches in regenerative medicine to improve patients’ lives,” said Wyss Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at HMS and Boston Children’s Hospital, and Professor of Bioengineering at the Harvard John A. Paulson School of Engineering and Applied Sciences.

  •

    New perovskite LED emits a circularly polarized glow

    Light-emitting diodes (LEDs) have revolutionized the displays industry. LEDs use electric current to produce visible light without the excess heat found in traditional light bulbs, a glow called electroluminescence. This breakthrough led to the eye-popping, high-definition viewing experience we’ve come to expect from our screens. Now, a group of physicists and chemists have developed a new type of LED that utilizes spintronics without needing a magnetic field, magnetic materials or cryogenic temperatures; a “quantum leap” that could take displays to the next level.
    “The companies that make LEDs or TV and computer displays don’t want to deal with magnetic fields and magnetic materials. It’s heavy and expensive to do it,” said Valy Vardeny, distinguished professor of physics and astronomy at the University of Utah. “Here, chiral molecules are self-assembled into standing arrays, like soldiers, that actively spin polarize the injected electrons, which subsequently lead to circularly polarized light emission. With no magnetic field, expensive ferromagnets and with no need for extremely low temperatures. Those are no-nos for the industry.”
    Most opto-electronic devices, such as LEDs, control only charge and light, not the spin of the electrons. Electrons possess tiny magnetic fields that, like the Earth, have magnetic poles on opposite sides. An electron’s spin may be viewed as the orientation of the poles and can be assigned binary information — an “up” spin is a “1,” a “down” is a “0.” In contrast, conventional electronics only transmit information through bursts of electrons along a conductive wire to convey messages in “1s” and “0s.” Spintronic devices, however, could utilize both methods, promising to process exponentially more information than traditional electronics.
    One barrier to commercial spintronics is setting the electron spin. Presently, one needs to produce a magnetic field to orient the electron spin direction. Researchers from the University of Utah and the National Renewable Energy Laboratory (NREL) developed technology that acts as an active spin filter made of two layers of a material called chiral two-dimensional metal-halide perovskites. The first layer blocks electrons having spin in the wrong direction, a layer that the authors call a chiral-induced spin filter. Then, when the remaining electrons pass through the second, light-emitting perovskite layer, they cause the layer to produce photons that move in unison along a spiral path, rather than a conventional wave pattern, producing circularly polarized electroluminescence.
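    The effect of the filter layer can be captured by a simple toy model: if one spin orientation is transmitted fully and the other is attenuated, the transmitted current is spin-polarized. The transmission factor below is an illustrative number, not a measured value:

```python
# Toy spin-filter model: spin-up electrons pass, spin-down electrons are
# attenuated by t_down. Polarization = (n_up - n_down) / (n_up + n_down).
def spin_polarization(n_up_in, n_down_in, t_down=0.1):
    n_up = n_up_in
    n_down = n_down_in * t_down
    return (n_up - n_down) / (n_up + n_down)

# An unpolarized input current becomes strongly polarized:
print(spin_polarization(1.0, 1.0))
```

    The degree of circular polarization of the emitted light then tracks this spin polarization of the injected carriers.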
    The study was published in the journal Science on March 12, 2021.
    Left-handed, right-handed molecules
    The scientists exploited a property called chirality that describes a particular type of geometry. Human hands are a classic example; the right and left hands are arranged as mirrors of one another, but they will never perfectly align, no matter the orientation. Some compounds, such as DNA, sugar and chiral metal-halide perovskites, have their atoms arranged in a chiral symmetry. A “left-handed” oriented chiral system may allow transport of electrons with “up” spins but block electrons with “down” spins, and vice versa.


    “If you try to transport electrons through these compounds, then the electron spin becomes aligned with the chirality of the material,” Vardeny said. Other spin filters do exist, but they either require some kind of magnetic field, or they can only manipulate electrons in a small area. “The beauty of the perovskite material that we used is that it’s two-dimensional — you can prepare many planes of 1 cm² area that contain a million billion (10¹⁵) standing molecules with the same chirality.”
    Metal-halide perovskite semiconductors are mostly used for solar cells these days, as they are highly efficient at converting sunlight to electricity. Since a solar cell is one of the most demanding applications of any semiconductor, scientists are discovering that other uses exist as well, including spin-LEDs.
    “We are exploring the fundamental properties of metal-halide perovskites, which has allowed us to discover new applications beyond photovoltaics,” said Joseph Luther, a co-author of the new paper and NREL scientist. “Because metal-halide perovskites, and other related metal halide organic hybrids, are some of the most fascinating semiconductors, they exhibit a host of novel phenomena that can be utilized in transforming energy.”
    Although metal-halide perovskites are the first to prove the chiral-hybrid devices are feasible, they are not the only candidates for spin-LEDs. The general formula for the active spin filter is one layer of an organic, chiral material, another layer of an inorganic metal halide, such as lead iodine, another organic layer, inorganic layer and so on.
    “That’s beautiful. I’d love that someone will come out with another 2-D organic/inorganic layer material that may do a similar thing. At this stage, it’s very general. I’m sure that with time, someone will find a different two-dimensional chiral material that will be even more efficient,” Vardeny said.
    The concept proves that these two-dimensional chiral-hybrid systems can control spin without magnets, and it has “broad implications for applications such as quantum-based optical computing, bioencoding and tomography,” according to Matthew Beard, a senior research fellow and director of the Center for Hybrid Organic Inorganic Semiconductors for Energy.
    Vardeny and Xin Pan from the Department of Physics & Astronomy at the University of Utah co-authored the study. The other co-authors from NREL are Beard, Young-Hoon Kim, Yaxin Zhai, Haipeng Lu, Chuanxiao Xiao, E. Ashley Gaulding, Steven Harvey and Joseph Berry. All are part of the CHOISE collaboration, an Energy Frontier Research Center (EFRC) funded by the Office of Science within the DOE.
    Funding for the research came from CHOISE.

  •

    Remote control for quantum emitters

    In order to exploit the properties of quantum physics technologically, quantum objects and their interaction must be precisely controlled. In many cases, this is done using light. Researchers at the University of Innsbruck and the Institute of Quantum Optics and Quantum Information (IQOQI) of the Austrian Academy of Sciences have now developed a method to individually address quantum emitters using tailored light pulses. “Not only is it important to individually control and read the state of the emitters,” says Oriol Romero-Isart, “but also to do so while leaving the system as undisturbed as possible.” Together with Juan Jose Garcia-Ripoll (IQOQI visiting fellow) from the Instituto de Fisica Fundamental in Madrid, Romero-Isart’s research group has now investigated how specifically engineered pulses can be used to focus light on a single quantum emitter.
    Self-compressing light pulse
    “Our proposal is based on chirped light pulses,” explains Silvia Casulleras, first author of the research paper. “The frequency of these light pulses is time-dependent.” So, similar to the chirping of birds, the frequency of the signal changes over time. In structures with certain electromagnetic properties — such as waveguides — the frequencies propagate at different speeds. “If you set the initial conditions of the light pulse correctly, the pulse compresses itself at a certain distance,” explains Patrick Maurer from the Innsbruck team. “Another important part of our work was to show that the pulse enables the control of individual quantum emitters.” This approach can be used as a kind of remote control to address, for example, individual superconducting quantum bits in a waveguide or atoms near a photonic crystal.
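The self-compression idea can be sketched numerically. In the following illustrative example (not the authors' calculation, and with arbitrary parameter values), a linearly chirped Gaussian pulse carries a quadratic spectral phase; a dispersive structure that applies the opposite quadratic phase cancels the chirp and compresses the pulse toward its transform limit:

```python
import numpy as np

# Time grid (arbitrary units)
N = 4096
t = np.linspace(-40.0, 40.0, N)
dt = t[1] - t[0]

T0, C = 1.0, 5.0                                  # initial duration and chirp parameter
E = np.exp(-(1 + 1j * C) * t**2 / (2 * T0**2))    # linearly chirped Gaussian pulse

# Angular-frequency grid matching numpy's FFT bin ordering
w = 2 * np.pi * np.fft.fftfreq(N, dt)

# The chirped Gaussian's spectral phase is b*w**2 with this coefficient;
# applying the opposite quadratic phase mimics dispersive propagation
# that exactly cancels the chirp.
b = C * T0**2 / (2 * (1 + C**2))
E_comp = np.fft.ifft(np.fft.fft(E) * np.exp(-1j * b * w**2))

def rms_width(field):
    """RMS duration of the intensity profile |field|^2."""
    I = np.abs(field)**2
    I /= I.sum()
    mean = (t * I).sum()
    return np.sqrt(((t - mean)**2 * I).sum())

# The pulse shortens by roughly sqrt(1 + C^2) and its peak grows accordingly.
print(rms_width(E), rms_width(E_comp))
```

With a chirp parameter of C = 5 the pulse shortens by about a factor of five, which is the essence of how a suitably pre-chirped pulse can arrive compressed, and hence focused in time, at a chosen distance along a waveguide.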
    Wide range of applications
In their work, now published in Physical Review Letters, the scientists show that this method works not only with light or electromagnetic pulses, but also with other waves such as lattice oscillations (phonons) or magnetic excitations (magnons). The research group led by the Innsbruck experimental physicist Gerhard Kirchmair wants to implement the concept for superconducting qubits in the laboratory, in close collaboration with the team of theorists.
    The research was financially supported by the European Union.

    Story Source:
Materials provided by University of Innsbruck. Note: Content may be edited for style and length.


    Experts recreate a mechanical Cosmos for the world's first computer

    Researchers at UCL have solved a major piece of the puzzle that makes up the ancient Greek astronomical calculator known as the Antikythera Mechanism, a hand-powered mechanical device that was used to predict astronomical events.
    Known to many as the world’s first analogue computer, the Antikythera Mechanism is the most complex piece of engineering to have survived from the ancient world. The 2,000-year-old device was used to predict the positions of the Sun, Moon and the planets as well as lunar and solar eclipses.
    Published in Scientific Reports, the paper from the multidisciplinary UCL Antikythera Research Team reveals a new display of the ancient Greek order of the Universe (Cosmos), within a complex gearing system at the front of the Mechanism.
    Lead author Professor Tony Freeth (UCL Mechanical Engineering) explained: “Ours is the first model that conforms to all the physical evidence and matches the descriptions in the scientific inscriptions engraved on the Mechanism itself.
    “The Sun, Moon and planets are displayed in an impressive tour de force of ancient Greek brilliance.”
    The Antikythera Mechanism has generated both fascination and intense controversy since its discovery in a Roman-era shipwreck in 1901 by Greek sponge divers near the small Mediterranean island of Antikythera.


The astronomical calculator is a bronze device consisting of a complex combination of 30 surviving gears used to predict astronomical events, including eclipses, phases of the moon, positions of the planets and even dates of the Olympics.
Whilst great progress has been made over the last century in understanding how it worked, studies in 2005 using 3D X-rays and surface imaging enabled researchers to show how the Mechanism predicted eclipses and calculated the variable motion of the Moon.
    However, until now, a full understanding of the gearing system at the front of the device has eluded the best efforts of researchers. Only about a third of the Mechanism has survived, and is split into 82 fragments — creating a daunting challenge for the UCL team.
The biggest surviving fragment, known as Fragment A, displays features of bearings, pillars and a block. Another, known as Fragment D, features an unexplained disk, a 63-tooth gear and a plate.
    Previous research had used X-ray data from 2005 to reveal thousands of text characters hidden inside the fragments, unread for nearly 2,000 years. Inscriptions on the back cover include a description of the cosmos display, with the planets moving on rings and indicated by marker beads. It was this display that the team worked to reconstruct.


Two critical numbers found in the X-rays of the front cover, 462 years and 442 years, accurately represent cycles of Venus and Saturn respectively. When observed from Earth, the planets sometimes reverse their motion against the stars. Experts must track these variable cycles over long time periods in order to predict the planets' positions.
    “The classic astronomy of the first millennium BC originated in Babylon, but nothing in this astronomy suggested how the ancient Greeks found the highly accurate 462-year cycle for Venus and 442-year cycle for Saturn,” explained PhD candidate and UCL Antikythera Research Team member Aris Dacanalis.
    Using an ancient Greek mathematical method described by the philosopher Parmenides, the UCL team not only explained how the cycles for Venus and Saturn were derived but also managed to recover the cycles of all the other planets, where the evidence was missing.
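The precision of these cycle lengths can be illustrated against modern values: dividing each span by a planet's modern synodic period gives almost exactly a whole number of synodic cycles, which is what makes the relations useful for long-term prediction. A quick check (the modern synodic periods below are assumed reference values, not figures from the study):

```python
# Check how closely the Mechanism's period relations fit modern synodic periods.
YEAR_DAYS = 365.25

# (planet, cycle length in years from the Mechanism, modern synodic period in days)
relations = [("Venus", 462, 583.92), ("Saturn", 442, 378.09)]

for planet, years, synodic_days in relations:
    cycles = years * YEAR_DAYS / synodic_days   # synodic cycles in the stated span
    nearest = round(cycles)
    error_days = abs(cycles - nearest) * synodic_days
    print(f"{planet}: {years} years ~ {nearest} synodic cycles "
          f"(off by about {error_days:.1f} days over {years} years)")
```

The residual error works out to only a few days across more than four centuries, which illustrates why these period relations were so valuable to ancient astronomers.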
    PhD candidate and team member David Higgon explained: “After considerable struggle, we managed to match the evidence in Fragments A and D to a mechanism for Venus, which exactly models its 462-year planetary period relation, with the 63-tooth gear playing a crucial role.”
    Professor Freeth added: “The team then created innovative mechanisms for all of the planets that would calculate the new advanced astronomical cycles and minimize the number of gears in the whole system, so that they would fit into the tight spaces available.”
“This is a key theoretical advance on how the Cosmos was constructed in the Mechanism,” added co-author, Dr Adam Wojcik (UCL Mechanical Engineering). “Now we must prove its feasibility by making it with ancient techniques. A particular challenge will be the system of nested tubes that carried the astronomical outputs.”