More stories

  •

    At the water's edge: Self-assembling 2D materials at a liquid-liquid interface

    The past few decades have witnessed a great amount of research in the field of two-dimensional (2D) materials. As the name implies, these thin film-like materials are composed of layers that are only a few atoms thick. Many of the chemical and physical properties of 2D materials can be fine-tuned, leading to promising applications in many fields, including optoelectronics, catalysis, renewable energy, and more.
    Coordination nanosheets are one particularly interesting type of 2D material. The “coordination” refers to the metallic ions in these materials, which act as coordination centers and can spontaneously organize the surrounding molecules into ordered arrangements spanning multiple layers. Such materials have attracted the attention of materials scientists owing to their favorable properties. In fact, we have only begun to scratch the surface of what heterolayer coordination nanosheets (coordination nanosheets whose layers differ in atomic composition) can offer.
    In a recent study published online on June 13, 2022, and featured on the front cover of Chemistry — A European Journal, a team of scientists from Tokyo University of Science (TUS) and The University of Tokyo in Japan reported a remarkably simple way to synthesize heterolayer coordination nanosheets. Composed of the organic ligand terpyridine coordinated to iron and cobalt, these nanosheets assemble themselves at the interface between two immiscible liquids. The study, led by Prof. Hiroshi Nishihara from TUS, also included contributions from Mr. Joe Komeda, Dr. Kenji Takada, Dr. Hiroaki Maeda, and Dr. Naoya Fukui from TUS.
    To synthesize the heterolayer coordination nanosheets, the team first created the liquid-liquid interface to enable their assembly. They dissolved tris(terpyridine) ligand in dichloromethane (CH2Cl2), an organic liquid that does not mix with water. They then poured a solution of water and ferrous tetrafluoroborate, an iron-containing chemical, on top of the CH2Cl2. After 24 hours, the first layer of the coordination nanosheet, bis(terpyridine)iron (or “Fe-tpy”), formed at the interface between both liquids.
    Following this, they removed the iron-containing water and replaced it with cobalt-containing water. In the next few days, a bis(terpyridine)cobalt (or “Co-tpy”) layer formed right below the iron-containing one at the liquid-liquid interface.
    The team made detailed observations of the heterolayer using various advanced techniques, such as scanning electron microscopy, X-ray photoelectron spectroscopy, atomic force microscopy, and scanning transmission electron microscopy. They found that the Co-tpy layer formed neatly below the Fe-tpy layer at the liquid-liquid interface. Moreover, they could control the thickness of the second layer depending on how long they let the synthesis process run.
    Interestingly, the team also found that the ordering of the layers could be swapped by simply changing the order of the synthesis steps. In other words, if they first added a cobalt-containing solution and then replaced it with an iron-containing solution, the synthesized heterolayer would have cobalt coordination centers on the top layer and iron coordination centers on the bottom layer. “Our findings indicate that metal ions can go through the first layer from the aqueous phase to the CH2Cl2 phase to react with terpyridine ligands right at the boundary between the nanosheet and the CH2Cl2 phase,” explains Prof. Nishihara. “This is the first ever clarification of the growth direction of coordination nanosheets at a liquid/liquid interface.”
    Additionally, the team investigated the reduction-oxidation properties of their coordination nanosheets as well as their electrical rectification characteristics. They found that the heterolayers behaved much like a diode in a way that is consistent with the electronic energy levels of Co-tpy and Fe-tpy. These insights, coupled with the easy synthesis procedure developed by the team, could help in the design of heterolayer nanosheets made of other materials and tailored for specific electronics applications. “Our synthetic method could be applicable to other coordination polymers synthesized at liquid-liquid interfaces,” highlights Prof. Nishihara. “Therefore, the results of this study will expand the structural and functional diversity of molecular 2D materials.”
    With eyes set on the future, the team will keep investigating chemical phenomena occurring at liquid-liquid interfaces, elucidating the mechanisms of mass transport and chemical reactions. Their findings can help expand the design of 2D materials and, hopefully, lead to better performance of optoelectronic devices, such as solar cells.
    Story Source:
    Materials provided by Tokyo University of Science. Note: Content may be edited for style and length.

  •

    Electric nanomotor made from DNA material

    A research team led by the Technical University of Munich (TUM) has succeeded for the first time in producing a molecular electric motor using the DNA origami method. The tiny machine made of genetic material self-assembles and converts electrical energy into kinetic energy. The new nanomotors can be switched on and off, and the researchers can control the rotation speed and rotational direction.
    Be it in our cars, drills or automatic coffee grinders — motors help us perform work in our everyday lives to accomplish a wide variety of tasks. On a much smaller scale, natural molecular motors perform vital tasks in our bodies. For instance, a motor protein known as ATP synthase produces the molecule adenosine triphosphate (ATP), which our body uses for short-term storage and transfer of energy.
    While natural molecular motors are essential, it has been quite difficult to recreate motors on this scale with mechanical properties roughly similar to those of natural molecular motors like ATP synthase. A research team has now constructed a working nanoscale molecular rotary motor using the DNA origami method. The team was led by Hendrik Dietz, Professor of Biomolecular Nanotechnology at TUM, Friedrich Simmel, Professor of Physics of Synthetic Biological Systems at TUM, and Ramin Golestanian, director at the Max Planck Institute for Dynamics and Self-Organization.
    A self-assembling nanomotor
    The novel molecular motor consists of DNA — genetic material. The researchers used the DNA origami method to assemble the motor from DNA molecules. This method was invented by Paul Rothemund in 2006 and was later further developed by the research team at TUM. Several long single strands of DNA serve as a scaffold to which additional DNA strands attach as counterparts. The DNA sequences are selected in such a way that the attached strands and the resulting folds create the desired structures.
    “We’ve been advancing this method of fabrication for many years and can now develop very precise and complex objects, such as molecular switches or hollow bodies that can trap viruses. If you put the DNA strands with the right sequences in solution, the objects self-assemble,” says Dietz.
    The new nanomotor made of DNA material consists of three components: base, platform and rotor arm. The base is approximately 40 nanometers high and is fixed via chemical bonds to a glass plate in solution. A rotor arm of up to 500 nanometers in length is mounted on the base so that it can rotate. Another component is crucial for the motor to work as intended: a platform that lies between the base and the rotor arm. This platform contains obstacles that influence the movement of the rotor arm. To pass the obstacles and rotate, the rotor arm must bend upward a little, similar to a ratchet.
    Targeted movement through AC voltage
    Without energy supply, the rotor arms of the motors move randomly in one direction or the other, driven by random collisions with molecules from the surrounding solvent. However, as soon as AC voltage is applied via two electrodes, the rotor arms rotate in a targeted and continuous manner in one direction.
    “The new motor has unprecedented mechanical capabilities: It can achieve torques in the range of 10 piconewton times nanometer. And it can generate more energy per second than what’s released when two ATP molecules are split,” explains Ramin Golestanian, who led the theoretical analysis of the mechanism of the motor.
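    Those figures can be sanity-checked with a back-of-the-envelope calculation. The ~30.5 kJ/mol free energy of ATP hydrolysis used below is an assumed textbook value, not a number from the paper:

```python
import math

torque = 10e-12 * 1e-9               # 10 pN*nm expressed in N*m (= 1e-20 J per radian)
work_per_rev = 2 * math.pi * torque  # mechanical work for one full revolution, in J

# Assumed textbook value: ~30.5 kJ/mol free energy per ATP hydrolysis
atp_j = 30.5e3 / 6.022e23            # per molecule, in J

print(f"work per revolution: {work_per_rev:.2e} J")  # ~6.3e-20 J
print(f"one ATP hydrolysis:  {atp_j:.2e} J")         # ~5.1e-20 J
```

    At the stated torque, a single revolution already delivers more energy than one ATP hydrolysis releases, so a motor turning at a couple of revolutions per second out-produces two ATP splittings per second, consistent with Golestanian's comparison.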
    The targeted movement of the motors results from a superposition of the fluctuating electrical forces with the forces experienced by the rotor arm due to the ratchet obstacles. The underlying mechanism realizes a so-called “flashing Brownian ratchet.” The researchers can control the speed and direction of the rotation via the direction of the electric field and also via the frequency and amplitude of the AC voltage.
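    The flashing-ratchet mechanism can be illustrated with a minimal overdamped Langevin simulation (a generic toy model with made-up parameters, not the paper's system): when an asymmetric sawtooth potential is periodically switched off, particles diffuse symmetrically, but switching it back on captures more of them in the next well on one side, producing directed drift; with the potential always on there is no net motion.

```python
import numpy as np

# Toy flashing Brownian ratchet (illustrative parameters, not the paper's system).
L, a = 1.0, 0.2   # sawtooth period and asymmetry: barrier sits 0.2*L right of each well
V0 = 5.0          # barrier height (units where friction gamma = 1, so kT = D)
D = 0.1           # diffusion coefficient; kT = 0.1 << V0, so wells hold particles when ON
dt = 1e-3

def force(x):
    """-dV/dx of the asymmetric sawtooth; minima at integer multiples of L."""
    u = np.mod(x, L)
    return np.where(u < a, -V0 / a, V0 / (L - a))

def mean_drift(n, cycles, t_on, t_off, flashing, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for _ in range(cycles):
        for _ in range(int(t_on / dt)):   # potential ON: particles slide into wells
            x += force(x) * dt + np.sqrt(2 * D * dt) * rng.standard_normal(n)
        for _ in range(int(t_off / dt)):  # potential OFF (if flashing): free diffusion
            drive = 0.0 if flashing else force(x) * dt
            x += drive + np.sqrt(2 * D * dt) * rng.standard_normal(n)
    return x.mean()

drift = mean_drift(1000, 30, t_on=0.2, t_off=0.5, flashing=True)
still = mean_drift(1000, 30, t_on=0.2, t_off=0.5, flashing=False)
print(f"flashing: {drift:+.2f}  always-on: {still:+.2f}")  # net drift only when flashing
```

    In the motor itself the "flashing" is supplied by the AC field acting on the rotor arm while the ratchet obstacles play the role of the asymmetric potential; the toy model only captures the rectification principle.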
    “The new motor could also have technical applications in the future. If we develop the motor further we could possibly use it in the future to drive user-defined chemical reactions, inspired by how ATP synthase makes ATP driven by rotation. Then, for example, surfaces could be densely coated with such motors. Then you would add starting materials, apply a little AC voltage and the motors produce the desired chemical compound,” says Dietz.

  •

    Deep learning for new alloys

    When is something more than just the sum of its parts? Alloys show such synergy. Steel, for instance, revolutionized industry by taking iron, adding a little carbon and making an alloy much stronger than either of its components.
    Supercomputer simulations are helping scientists discover new types of alloys, called high-entropy alloys. The researchers used the Stampede2 supercomputer at the Texas Advanced Computing Center (TACC), allocated through the Extreme Science and Engineering Discovery Environment (XSEDE).
    Their research was published in April 2022 in npj Computational Materials. The approach could be applied to finding new materials for batteries, catalysts and more without the need for expensive metals such as platinum or cobalt.
    “High-entropy alloys represent a totally different design concept. In this case we try to mix multiple principal elements together,” said study senior author Wei Chen, associate professor of materials science and engineering at the Illinois Institute of Technology.
    The term “high entropy,” in a nutshell, refers to the large configurational entropy gained by randomly mixing multiple elements at similar atomic fractions; this mixing entropy lowers the free energy and can stabilize new and novel materials resulting from the ‘cocktail.’
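    A quick illustration of why near-equal fractions matter: the ideal configurational entropy of mixing, ΔS = −R Σ x_i ln x_i, is maximized when all atomic fractions are equal. This is a textbook formula, not a calculation from the study:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def mixing_entropy(fractions):
    """Ideal configurational entropy of mixing: dS = -R * sum(x_i * ln x_i)."""
    assert abs(sum(fractions) - 1.0) < 1e-9
    return -R * sum(x * math.log(x) for x in fractions if x > 0)

s_equal = mixing_entropy([0.2] * 5)      # five elements at 20% each: R*ln(5) ~ 13.4 J/(mol*K)
s_dilute = mixing_entropy([0.98, 0.02])  # a dilute binary alloy gains far less
print(s_equal, s_dilute)
```

    Mixing five elements in equal parts yields roughly sixteen times the configurational entropy of a 2% dilute binary alloy, which is the thermodynamic driving force behind the "high-entropy" design concept.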
    For the study, Chen and colleagues surveyed a large space of 14 elements and the combinations that yield high-entropy alloys. They performed high-throughput quantum mechanical calculations of the stability and elastic properties (the ability of a material to regain its size and shape after stress) of more than 7,000 high-entropy alloys.

  •

    Robots learn household tasks by watching humans

    The robot watched as Shikhar Bahl opened the refrigerator door. It recorded his movements, the swing of the door, the location of the fridge and more, analyzing this data and readying itself to mimic what Bahl had done.
    It failed at first, missing the handle completely at times, grabbing it in the wrong spot or pulling it incorrectly. But after a few hours of practice, the robot succeeded and opened the door.
    “Imitation is a great way to learn,” said Bahl, a Ph.D. student at the Robotics Institute (RI) in Carnegie Mellon University’s School of Computer Science. “Having robots actually learn from directly watching humans remains an unsolved problem in the field, but this work takes a significant step in enabling that ability.”
    Bahl worked with Deepak Pathak and Abhinav Gupta, both faculty members in the RI, to develop a new learning method for robots called WHIRL, short for In-the-Wild Human Imitating Robot Learning. WHIRL is an efficient algorithm for one-shot visual imitation. It can learn directly from human-interaction videos and generalize that information to new tasks, making robots well-suited to learning household chores. People constantly perform various tasks in their homes. With WHIRL, a robot can observe those tasks and gather the video data it needs to eventually determine how to complete the job itself.
    The team added a camera and their software to an off-the-shelf robot, and it learned how to do more than 20 tasks — from opening and closing appliances, cabinet doors and drawers to putting a lid on a pot, pushing in a chair and even taking a garbage bag out of the bin. Each time, the robot watched a human complete the task once and then went about practicing and learning to accomplish the task on its own. The team presented their research this month at the Robotics: Science and Systems conference in New York.
    “This work presents a way to bring robots into the home,” said Pathak, an assistant professor in the RI and a member of the team. “Instead of waiting for robots to be programmed or trained to successfully complete different tasks before deploying them into people’s homes, this technology allows us to deploy the robots and have them learn how to complete tasks, all the while adapting to their environments and improving solely by watching.”
    Current methods for teaching a robot a task typically rely on imitation or reinforcement learning. In imitation learning, humans manually operate a robot to teach it how to complete a task. This process must be done several times for a single task before the robot learns. In reinforcement learning, the robot is typically trained on millions of examples in simulation and then asked to adapt that training to the real world.
    Both learning models work well when teaching a robot a single task in a structured environment, but they are difficult to scale and deploy. WHIRL can learn from any video of a human doing a task. It is easily scalable, not confined to one specific task and can operate in realistic home environments. The team is even working on a version of WHIRL trained by watching videos of human interaction from YouTube and Flickr.
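    The imitation-learning baseline described above reduces, in its simplest form, to supervised regression from demonstrated states to the operator's actions ("behavior cloning"). A toy sketch with a synthetic linear policy — purely illustrative, not WHIRL's algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
states = rng.standard_normal((200, 4))           # 200 demo timesteps, 4-D state
expert_policy = np.array([0.5, -1.0, 2.0, 0.3])  # hypothetical "human" state->action map
actions = states @ expert_policy                 # demonstrated actions

# Behavior cloning: fit a linear policy to the demonstrations by least squares
learned, *_ = np.linalg.lstsq(states, actions, rcond=None)
print(learned)  # recovers the expert policy from the demonstrations alone
```

    The limitation the article points out is visible even here: the fit only covers states that appear in the demonstrations, which is why classical imitation learning needs many repetitions per task, and why WHIRL instead combines one human video with the robot's own practice.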
    Progress in computer vision made the work possible. Using models trained on internet data, computers can now understand and model movement in 3D. The team used these models to understand human movement, which facilitated training WHIRL.
    With WHIRL, a robot can accomplish tasks in natural environments. The appliances, doors, drawers, lids, chairs and garbage bag were not modified or manipulated to suit the robot. The robot’s first several attempts at a task ended in failure, but once it had a few successes, it quickly latched on to how to accomplish the task and mastered it. While the robot may not accomplish the task with the same movements as a human, that’s not the goal. Humans and robots have different parts, and they move differently. What matters is that the end result is the same. The door is opened. The switch is turned off. The faucet is turned on.
    “To scale robotics in the wild, the data must be reliable and stable, and the robots should become better in their environment by practicing on their own,” Pathak said.
    Story Source:
    Materials provided by Carnegie Mellon University. Original written by Aaron Aupperlee. Note: Content may be edited for style and length.

  •

    Alexa and Siri, listen up! Teaching machines to really hear us

    University of Virginia cognitive scientist Per Sederberg has a fun experiment you can try at home. Take out your smartphone and, using a voice assistant such as the one for Google’s search engine, say the word “octopus” as slowly as you can.
    Your device will struggle to make sense of what you just said. It might supply a nonsensical response, or it might give you something close but still off — like “toe pus.” Gross!
    The point is, Sederberg said, when it comes to receiving auditory signals like humans and other animals do — despite all of the computing power dedicated to the task by such heavyweights as Google, DeepMind, IBM and Microsoft — current artificial intelligence remains a bit hard of hearing.
    The outcomes can range from comical and mildly frustrating to downright alienating for those who have speech problems.
    But using recent breakthroughs in neuroscience as a model, UVA collaborative research has made it possible to convert existing AI neural networks into technology that can truly hear us, no matter at what pace we speak.
    The deep learning tool is called SITHCon, and by generalizing its input, it can understand words spoken at speeds different from those the network was trained on.
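    One intuition for how a network can generalize across speaking rates comes from log-compressed representations of time, the general idea behind scale-invariant temporal history models (a toy sketch of that idea, not SITHCon's actual architecture): sample a signal at log-spaced time lags, and a change of rate becomes a simple shift of the samples rather than a distortion.

```python
import numpy as np

# Log-spaced time taps: t_k = t0 * r**k
t0, r, n = 0.01, 1.25, 40
taps = t0 * r ** np.arange(n)

def feature(t):
    """Toy acoustic feature: a smooth bump in (log) time."""
    return np.exp(-((np.log(t) - np.log(0.3)) ** 2))

a = r ** 3                # speak roughly 2x slower (1.25**3 ~ 1.95)
fast = feature(taps)      # original rate
slow = feature(taps / a)  # slowed-down version, read through the same taps

# On the log-spaced taps, the rate change is just a 3-sample shift:
print(np.allclose(slow[3:], fast[:-3]))  # True
```

    A shift is easy for a convolutional network to tolerate, whereas on a uniformly sampled axis the same slowdown stretches the pattern and defeats filters tuned to one rate.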

  •

    Motion capture reveals why VAR in football struggles with offside decisions

    New research by the University of Bath has used motion capture technology to assess the accuracy of Video Assistant Referee (VAR) technologies in football. The study suggests that VAR is useful for preventing obvious mistakes but is currently not precise enough to give accurate judgements every time.
    VAR was introduced into association football in 2018 to help referees review decisions for goals, red cards, penalties and offsides. The technology uses film footage from pitch-side cameras, meaning that VAR operators can view the action from different angles and then offer their judgements on incidents to the head referee to make a final decision.
    However, the accuracy and application of VAR have also been questioned by some, including high-profile pundits like Gary Lineker and Alan Shearer, following controversial decisions which can change the course of the game.
    Critics of VAR further argue that it hampers the flow of the game; however, some research suggests it has reduced the number of fouls, offsides and yellow cards.
    Dr Pooya Soltani, from the University of Bath’s Centre for Analysis of Motion, Entertainment Research and Applications (CAMERA), used optical motion capture systems to assess the accuracy of VAR systems.
    He filmed a football player receiving the ball from a teammate, viewed from different camera angles, whilst recording the 3D positions of the ball and players using optical motion capture cameras.

  •

    Physicists use quantum simulation tools to study, understand exotic state of matter

    Physicists have demonstrated how simulations using quantum computing can enable observation of a distinctive state of matter taken out of its normal equilibrium. Such novel states of matter could one day lead to developments in fast, powerful quantum information storage and precision measurement science.
    Thomas Iadecola worked his way through the title of the latest research paper that includes his theoretical and analytical work, patiently explaining digital quantum simulation, Floquet systems and symmetry-protected topological phases.
    Then he offered explanations of nonequilibrium systems, time crystals, 2T periodicity and the 2016 Nobel Prize in Physics.
    Iadecola’s corner of quantum condensed matter physics — the study of how states of matter emerge from collections of atoms and subatomic particles — can be counterintuitive and needs an explanation at most every turn and term.
    The bottom line, as explained by the Royal Swedish Academy of Sciences in announcing that 2016 physics prize to David Thouless, Duncan Haldane and Michael Kosterlitz, is that researchers are revealing more and more of the secrets of exotic matter, “an unknown world where matter can assume strange states.”
    The new paper published in the journal Nature and co-authored by Iadecola, an Iowa State University assistant professor of physics and astronomy and an Ames National Laboratory scientist, describes simulations using quantum computing that enabled observation of a distinctive state of matter taken out of its normal equilibrium.

  •

    Idea of ice age 'species pump' in the Philippines boosted by new way of drawing evolutionary trees

    Does the Philippines’ astonishing biodiversity result in part from rising and falling seas during the ice ages?
    Scientists have long thought the unique geography of the Philippines — coupled with seesawing ocean levels — could have created a “species pump” that triggered massive diversification by isolating, then reconnecting, groups of species again and again on islands. They call the idea the “Pleistocene aggregate island complex (PAIC) model” of diversification.
    But hard evidence, connecting bursts of speciation to the precise times that global sea levels rose and fell, has been scant until now.
    A groundbreaking Bayesian method and new statistical analyses of genomic data from geckos in the Philippines show, for the first time, that the timing of gecko diversification during the ice ages gives strong statistical support to the PAIC model, or “species pump.” The investigation, with roots at the University of Kansas, was just published in the Proceedings of the National Academy of Sciences.
    “The Philippines is an isolated archipelago, currently including more than 7,100 islands, but this number was dramatically reduced, possibly to as few as six or seven giant islands, during the Pleistocene,” said co-author Rafe Brown, curator-in-charge of the herpetology division of the Biodiversity Institute and Natural History Museum at KU. “The aggregate landmasses were composed of many of today’s smaller islands, which became connected together by dry land as sea levels fell, and all that water was tied up in glaciers. It’s been hypothesized that this kind of fragmentation and fusion of land, which happened as sea levels repeatedly fluctuated over the last 4 million years, sets the stage for a special evolutionary process, which may have triggered simultaneous clusters or bursts of speciation in unrelated organisms present at the time. In this case, we tested this prediction in two different genera of lizards, each with species found only in the Philippines.”
    For decades, the Philippines has been a hotbed of fieldwork by biologists with KU’s Biodiversity Institute, where the authors analyzed genetic samples of Philippine geckos as well as other animals. However, even with today’s technology and scientists’ ability to characterize variation from across the genome, the development of powerful statistical approaches capable of handling genome-scale data is still catching up — particularly in challenging cases, like the task of estimating past times that species formed, using genetic data collected from populations surviving today.