More stories

  • Scientists craft living human skin for robots

    From action heroes to villainous assassins, biohybrid robots made of both living and artificial materials have been at the center of many sci-fi fantasies, inspiring today’s robotic innovations. Human-like robots that walk among us in daily life are still a long way off, but scientists from Japan have brought us one step closer by crafting living human skin on robots. The method, presented June 9 in the journal Matter, gave a robotic finger not only a skin-like texture but also water-repellent and self-healing functions.
    “The finger looks slightly ‘sweaty’ straight out of the culture medium,” says first author Shoji Takeuchi, a professor at the University of Tokyo, Japan. “Since the finger is driven by an electric motor, it is also interesting to hear the clicking sounds of the motor in harmony with a finger that looks just like a real one.”
    Looking “real” like a human is one of the top priorities for humanoid robots, which are often tasked with interacting with humans in the healthcare and service industries. A human-like appearance can improve communication efficiency and evoke likability. While the silicone skin currently made for robots can mimic human appearance, it falls short when it comes to delicate textures like wrinkles, and it lacks skin-specific functions. Attempts at fabricating living skin sheets to cover robots have also had limited success, since it is challenging to conform the sheets to dynamic objects with uneven surfaces.
    “With that method, you have to have the hands of a skilled artisan who can cut and tailor the skin sheets,” says Takeuchi. “To efficiently cover surfaces with skin cells, we established a tissue molding method to directly mold skin tissue around the robot, which resulted in a seamless skin coverage on a robotic finger.”
    To craft the skin, the team first submerged the robotic finger in a cylinder filled with a solution of collagen and human dermal fibroblasts, the two main components of the skin’s connective tissues. Takeuchi says the study’s success lies in the natural shrinking tendency of this collagen and fibroblast mixture, which shrank and tightly conformed to the finger. Like a paint primer, this layer provided a uniform foundation for the next coat of cells — human epidermal keratinocytes — to stick to. These cells make up 90% of the outermost layer of skin, giving the robot a skin-like texture and moisture-retaining barrier properties.
    The crafted skin had enough strength and elasticity to withstand the dynamic movements as the robotic finger curled and stretched. The outermost layer was thick enough to be lifted with tweezers, and it repelled water, which provides various advantages for specific tasks such as handling tiny, electrostatically charged pieces of polystyrene foam, a material often used in packaging. When wounded, the crafted skin could even self-heal, as human skin does, with the help of a collagen bandage, which gradually morphed into the skin and withstood repeated joint movements.
    “We are surprised by how well the skin tissue conforms to the robot’s surface,” says Takeuchi. “But this work is just the first step toward creating robots covered with living skin.” The developed skin is much weaker than natural skin and can’t survive long without constant nutrient supply and waste removal. Next, Takeuchi and his team plan to address those issues and incorporate more sophisticated functional structures within the skin, such as sensory neurons, hair follicles, nails, and sweat glands.
    “I think living skin is the ultimate solution to give robots the look and touch of living creatures since it is exactly the same material that covers animal bodies,” says Takeuchi.
    This work was supported by funding from JSPS Grants-in-Aid for Scientific Research (KAKENHI) and JSPS Grant-in-Aid for Early-Career Scientists (KAKENHI).
    Story Source:
    Materials provided by Cell Press. Note: Content may be edited for style and length.

  • Researchers demonstrate 40-channel optical communication link

    Researchers have demonstrated a silicon-based optical communication link that combines two multiplexing technologies to create 40 optical data channels that can simultaneously move data. The new chip-scale optical interconnect can transmit about 400 GB of data per second — the equivalent of about 100,000 streaming movies. This could improve data-intensive internet applications from video streaming services to high-capacity transactions for the stock market.
    “As demands to move more information across the internet continue to grow, we need new technologies to push data rates further,” said Peter Delfyett, who led the University of Central Florida College of Optics and Photonics (CREOL) research team. “Because optical interconnects can move more data than their electronic counterparts, our work could enable better and faster data processing in the data centers that form the backbone of the internet.”
    A multi-institutional group of researchers describes the new optical communication link in the Optica Publishing Group journal Optics Letters. It achieves 40 channels by combining a frequency comb light source based on a new photonic crystal resonator developed by the National Institute of Standards and Technology (NIST) with an optimized mode-division multiplexer designed by the researchers at Stanford University. Each channel can be used to carry information much like different stereo channels, or frequencies, transmit different music stations.
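    The 40 channels arise from multiplying the two multiplexing dimensions, and the per-channel load follows from dividing the quoted aggregate across them. A rough sketch of that arithmetic (the 8 × 5 split below is an assumption for illustration, not the layout reported in the paper):
    ```python
    # Back-of-the-envelope sketch; only the 40-channel total and the ~400 GB/s
    # aggregate come from the story above. The wavelength/mode split is assumed.
    comb_wavelengths = 8   # assumed number of frequency-comb lines (wavelength channels)
    spatial_modes = 5      # assumed number of waveguide modes (mode-division channels)
    channels = comb_wavelengths * spatial_modes
    assert channels == 40

    aggregate_gb_per_s = 400                     # figure quoted in the story
    per_channel = aggregate_gb_per_s / channels  # data each channel must carry
    print(f"{channels} channels -> {per_channel:.0f} GB/s per channel")
    ```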
    “We show that these new frequency combs can be used in fully integrated optical interconnects,” said Chinmay Shirpurkar, co-first author of the paper. “All the photonic components were made from silicon-based material, which demonstrates the potential for making optical information handling devices from low-cost, easy-to-manufacture optical interconnects.”
    In addition to improving internet data transmission, the new technology could also be used to make faster optical computers that could provide the high levels of computing power needed for artificial intelligence, machine learning, large-scale emulation and other applications.
    Using multiple light dimensions
    The new work involved research teams led by Firooz Aflatouni of the University of Pennsylvania, Scott B. Papp from NIST, Jelena Vuckovic from Stanford University and Delfyett from CREOL. It is part of the DARPA Photonics in the Package for Extreme Scalability (PIPES) program, which aims to use light to vastly improve the digital connectivity of packaged integrated circuits using microcomb-based light sources.

  • Artificial intelligence reveals a never-before-described 3D structure in rotavirus spike protein

    Of the three groups of rotavirus that cause gastroenteritis in people (groups A, B and C), groups A and C affect mostly children and are the best characterized. For group B, which causes severe diarrhea predominantly in adults, little is known about the tip of the virus’s spike protein, the VP8* domain, which mediates infection of cells in the gut.
    “Determining the structure of VP8* in group B rotavirus is important because it will help us understand how the virus infects gastrointestinal cells and design strategies to prevent and treat this infection that causes severe diarrheal outbreaks,” said corresponding author Dr. B. V. Venkataram Prasad, professor of biochemistry and molecular biology at Baylor College of Medicine.
    The team’s first step was to determine the 3D structure of VP8* B using X-ray crystallography, a laborious and time-consuming process. However, this traditional approach was unsuccessful in this case. The researchers then turned to a recently developed artificial intelligence-based computational program called AlphaFold2.
    “AlphaFold2 predicts the 3D structure of proteins according to their genetic sequence,” said first author and co-corresponding author Dr. Liya Hu, assistant professor of biochemistry and molecular biology at Baylor. “We knew that the protein sequence of VP8* of rotavirus group B was about 10% similar to the sequences of VP8* of rotavirus A and C, so we expected differences in the 3D structure as well. But we were surprised when AlphaFold2 predicted a 3D structure for the VP8* B that was not just totally different from that of the VP8* domain in rotavirus A and C, but also that no other protein before had been reported to have this structure.”
    With this information in hand, the researchers went back to the lab bench and, using X-ray crystallography, experimentally confirmed that the structure of VP8* B predicted by AlphaFold2 indeed matched the actual structure of the protein.
    How rotavirus infects cells
    Previous research has shown that rotavirus A and C infect cells by using the VP8* domain to bind to specific sugar components on histo-blood group antigens, including the A, B, AB and O blood groups, present in many cells in the body. It has been proposed that the ability of different rotaviruses to bind to different sugars on the histo-blood group antigens might explain why some of these viruses specifically infect young children while others affect other populations. Unlike that of VP8* A and VP8* C, the sugar specificity of VP8* B had not been characterized until now.

  • Ancient penguin bones reveal unprecedented shrinkage in key Antarctic glaciers

    Antarctica’s Pine Island and Thwaites glaciers are losing ice more quickly than they have at any time in the last few thousand years, ancient penguin bones and limpet shells suggest.

    Scientists are worried that the glaciers, two of Antarctica’s fastest-shrinking ones, are in the process of unstable, runaway retreat. By reconstructing the history of the glaciers using the old bones and shells, researchers wanted to find out whether these glaciers have ever been smaller than they are today.

    “If the ice has been smaller in the past, and did readvance, that shows that we’re not necessarily in runaway retreat” right now, says glacial geologist Brenda Hall of the University of Maine in Orono. The new result, described June 9 in Nature Geoscience, “doesn’t give us any comfort,” Hall says. “We can’t refute the hypothesis of a runaway retreat.”

    Pine Island and Thwaites glaciers sit in a broad ocean basin shaped like a bowl, deepening toward the middle. This makes the ice vulnerable to warm currents of dense, salty water that hug the ocean floor (SN: 4/9/21). Scientists have speculated that as the glaciers retreat farther inland, they could tip into an irreversible collapse (SN: 12/13/21). That collapse could play out over centuries and raise the sea level by roughly a meter.

    Researchers dated ancient shorelines (seen here as the series of small ridges in the rocky terrain between the foreground boulders and background snow) on islands roughly 100 kilometers from Pine Island and Thwaites glaciers in Antarctica to help figure out if the glaciers are in the process of unstable, runaway retreat. (Credit: James Kirkham)

    To reconstruct how the glaciers have changed over thousands of years, the researchers turned to old penguin bones and shells, collected by Scott Braddock, a glacial geologist in Hall’s lab, during a research cruise in 2019 on the U.S. icebreaker Nathaniel B. Palmer.

    One afternoon, Braddock clambered from a bobbing inflatable boat onto the barren shores of Lindsey 1 — one of a dozen or more rocky islands that sit roughly 100 kilometers from where Pine Island Glacier terminates in the ocean. As he climbed the slope, his boots slipped over rocks covered in penguin guano and dotted with dingy white feathers. Then, he came upon a series of ridges — rocks and pebbles that were piled up by waves during storms thousands of years before — that marked ancient shorelines.

    Twelve thousand years ago, just as the last ice age was ending, this island would have been entirely submerged in the ocean. But as nearby glaciers shed billions of metric tons of ice, the removal of that weight allowed Earth’s crust to spring up like a bed mattress — pushing Lindsey 1 and other nearby islands out of the water, a few millimeters per year.

    As Lindsey 1 rose, a series of shorelines formed on the edges of the island — and then were lifted, one after another, out of reach of the waves. By measuring the ages and heights of those stranded shorelines, the researchers could tell how quickly the island had risen. Because the rate of uplift is determined by the amount of ice being lost from nearby glaciers, this would reveal how quickly Pine Island and Thwaites glaciers had retreated — and whether they had gotten smaller than they are today and then readvanced.

    Braddock dug into the pebbly ridges, collecting ancient cone-shaped limpet shells and marble-sized fragments of penguin bones deposited when the shorelines formed. Back in Maine, he and his colleagues radiocarbon dated those objects to estimate the ages of the shorelines. Ultimately, the researchers dated nearly two dozen shorelines, spread across several islands in the region.

    These dates showed that the oldest and highest beach formed 5,500 years ago. Since that time, up until the last few decades, the islands have risen at a steady rate of about 3.5 millimeters per year. That is far slower than the 20 to 40 millimeters per year at which the land around Pine Island and Thwaites is currently rising, suggesting that, after thousands of years of relative stability, the rate of ice loss from nearby glaciers has skyrocketed with the onset of rapid human-caused warming.
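
    The arithmetic behind those rates is simple: divide a shoreline’s elevation by its age. A minimal sketch, using a hypothetical beach elevation chosen to be consistent with the roughly 3.5 millimeters per year reported above (the elevation itself is not a figure from the paper):

    ```python
    # Hypothetical illustration: back out the long-term uplift rate from a dated
    # shoreline. The beach elevation is an assumed value, not a measurement.
    beach_age_years = 5_500        # radiocarbon age of the oldest shoreline
    beach_height_mm = 19_250       # assumed elevation above sea level (19.25 m)

    long_term_rate = beach_height_mm / beach_age_years   # ≈ 3.5 mm per year
    lo, hi = 20, 40                                       # present-day uplift, mm/yr, from the story

    print(f"long-term uplift ≈ {long_term_rate:.1f} mm/yr; "
          f"today's {lo}-{hi} mm/yr is roughly {lo/long_term_rate:.0f}-{hi/long_term_rate:.0f}x faster")
    ```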

    “We’re going into unknown territory,” Braddock says. “We don’t have an analog to compare what’s going on today with what happened in the past.”

    Slawek Tulaczyk, a glaciologist at the University of California, Santa Cruz, sees the newly dated shorelines as “an important piece of information.” But he cautions against overinterpreting the results. While these islands are 100 kilometers from Pine Island and Thwaites, they are less than 50 kilometers from several smaller glaciers — and changes in these closer glaciers might have obscured whatever was happening at Pine Island and Thwaites long ago. He suspects that Pine Island and Thwaites could still have retreated and then readvanced a few dozen kilometers: “I don’t think this study settles it.”

  • Paving the way for faster computers, longer-lasting batteries

    University of Queensland scientists have cracked a problem that’s frustrated chemists and physicists for years, potentially leading to a new age of powerful, efficient, and environmentally friendly technologies.
    Using quantum mechanics, Professor Ben Powell from UQ’s School of Mathematics and Physics has discovered a ‘recipe’ which allows molecular switches to work at room temperature.
    “Switches are materials that can shift between two or more states, such as on and off or 0 and 1, and are the basis of all digital technologies,” Professor Powell said.
    “This discovery paves the way for smaller and more powerful and energy efficient technologies.
    “You can expect batteries to last longer and computers to run faster.”
    Until now, molecular switching has only been possible when the molecules are extremely cold — at temperatures below minus 250 degrees centigrade.

  • Radio waves for the detection of hardware tampering

    As far as data security is concerned, there is an even greater danger than remote cyberattacks: tampering with hardware, which can be used to read out information such as credit card data from a card reader. Researchers in Bochum have developed a new method to detect such manipulations. They monitor systems with radio waves that react to the slightest changes in the ambient conditions. Unlike conventional methods, this approach can protect entire systems, not just individual components, and it can do so at a lower cost. The RUB’s science magazine Rubin features a report on the work of the team from Ruhr-Universität Bochum (RUB), the Max Planck Institute for Security and Privacy and the IT company PHYSEC.
    Paul Staat and Johannes Tobisch presented their findings at the IEEE Symposium on Security and Privacy, which took place in the USA from 23 to 25 May 2022. Both researchers are doing their PhDs at RUB and conducting research at the Max Planck Institute for Security and Privacy in Bochum in Professor Christof Paar’s team. For their research, they are cooperating with Dr. Christian Zenger from the RUB spin-off company PHYSEC.
    Protection through radio waves
    Data is ultimately nothing more than electrical currents that travel between different computer components via conductive paths. A tiny metallic object, placed in the right spot on the hardware, can be enough to tap into these information streams. To date, it has only been possible to protect individual components of a system, such as a crucial memory element or a processor, from such manipulations. “Typically, this is done with a type of foil with thin wires in which the hardware component is wrapped,” explains Paul Staat. “If the foil is damaged, an alarm is triggered.”
    The radio wave technology from Bochum, however, can be used to monitor an entire system. To this end, the researchers install two antennas in the system: a transmitter and a receiver. The transmitter sends out a special radio signal that spreads everywhere in the system and is reflected by the walls and computer components. All these reflections cause a signal to reach the receiver that is as characteristic of the system as a fingerprint.
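    In software terms, tamper detection of this kind amounts to comparing freshly measured responses against a stored reference fingerprint and raising an alarm when the deviation exceeds a calibrated threshold. A minimal sketch of that comparison, with placeholder data rather than the Bochum team’s actual signal processing:
    ```python
    import numpy as np

    def fingerprint_distance(baseline: np.ndarray, current: np.ndarray) -> float:
        """Mean squared deviation between a stored reference response and a new one."""
        return float(np.mean((baseline - current) ** 2))

    # Placeholder data standing in for measured channel responses (e.g., amplitude
    # per frequency bin). A real system would average many measurements and tolerate
    # benign drift such as temperature changes.
    rng = np.random.default_rng(0)
    baseline = rng.normal(size=256)
    untampered = baseline + rng.normal(scale=0.01, size=256)  # normal fluctuation
    tampered = baseline + rng.normal(scale=0.01, size=256)
    tampered[100:110] += 0.5                                  # localized change, e.g. an inserted object

    THRESHOLD = 0.005  # assumed alarm threshold, calibrated from known-good measurements
    for name, measurement in [("untampered", untampered), ("tampered", tampered)]:
        d = fingerprint_distance(baseline, measurement)
        print(name, f"distance={d:.4f}", "ALARM" if d > THRESHOLD else "ok")
    ```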
    Technology reacts to the slightest changes
    Tiny changes to the system are enough to have a noticeable effect on the fingerprint, as the team demonstrated in experiments. The IT experts equipped a conventional computer with radio antennas and punctured its housing with holes at regular intervals. Through these holes, the researchers inserted a fine metal needle into the system and checked whether the change showed up in the radio signal. In the process, they varied the needle’s thickness, position and penetration depth.
    With the computer running, their system reliably detected a 0.3-millimetre-thick needle once it had penetrated one centimetre into the system. The system still detected a needle that was only 0.1 millimetres thick — about as thick as a hair — but not in all positions. “The closer the needle is to the receiving antenna, the easier it is to detect,” explains Staat. “Therefore, in practical applications, it makes sense to think carefully about where you place the antennas,” adds Tobisch. “They should be as close as possible to the components that require a high degree of protection.”
    Basically, the technology is suitable for both high-security applications and everyday problems. The IT company PHYSEC already uses it to prevent unauthorised manipulation of critical infrastructure components.
    Story Source:
    Materials provided by Ruhr-University Bochum. Original written by Julia Weiler. Note: Content may be edited for style and length.

  • Faster computing results without fear of errors

    Researchers have pioneered a technique that can dramatically accelerate certain types of computer programs automatically, while ensuring program results remain accurate.
    Their system boosts the speeds of programs that run in the Unix shell, a ubiquitous programming environment created 50 years ago that is still widely used today. Their method parallelizes these programs, which means that it splits program components into pieces that can be run simultaneously on multiple computer processors.
    This enables programs to execute tasks like web indexing, natural language processing, or data analysis in a fraction of their original runtime.
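    The core idea the system automates, splitting independent per-record work across processor cores and merging the results without changing the answer, can be sketched generically (this is a plain data-parallelism illustration in Python, not the researchers’ shell-analysis machinery):
    ```python
    from multiprocessing import Pool

    def count_words(line: str) -> int:
        # Stand-in for a per-record step in a shell-style text-processing pipeline.
        return len(line.split())

    if __name__ == "__main__":
        lines = [f"record {i} with a few words" for i in range(100_000)]
        with Pool() as pool:                    # one worker per available CPU core
            counts = pool.map(count_words, lines, chunksize=1_000)
        print("total words:", sum(counts))      # same answer as a sequential loop
    ```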
    “There are so many people who use these types of programs, like data scientists, biologists, engineers, and economists. Now they can automatically accelerate their programs without fear that they will get incorrect results,” says Nikos Vasilakis, research scientist in the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT.
    The system also makes things easier for the programmers who develop the tools that data scientists, biologists, engineers, and others use: they don’t need to make any special adjustments to their program commands to enable this automatic, error-free parallelization, adds Vasilakis, who chairs a committee of researchers from around the world that has been working on this system for nearly two years.
    Vasilakis is senior author of the group’s latest research paper, which includes MIT co-author and CSAIL graduate student Tammam Mustafa and will be presented at the USENIX Symposium on Operating Systems Design and Implementation. Co-authors include lead author Konstantinos Kallas, a graduate student at the University of Pennsylvania; Jan Bielak, a student at Warsaw Staszic High School; Dimitris Karnikis, a software engineer at Aarno Labs; Thurston H.Y. Dang, a former MIT postdoc who is now a software engineer at Google; and Michael Greenberg, assistant professor of computer science at the Stevens Institute of Technology.

  • Bluetooth signals can be used to identify and track smartphones

    A team of engineers at the University of California San Diego has demonstrated for the first time that the Bluetooth signals emitted constantly by our mobile phones have a unique fingerprint that can be used to track individuals’ movements.
    Mobile devices, including phones, smartwatches and fitness trackers, constantly transmit signals known as Bluetooth beacons at a rate of roughly 500 beacons per minute. These beacons enable features like Apple’s “Find My” lost-device tracking service and COVID-19 tracing apps, and they connect smartphones to other devices such as wireless earphones.
    Prior research has shown that wireless fingerprinting exists in WiFi and other wireless technologies. The critical insight of the UC San Diego team was that this form of tracking can also be done with Bluetooth, in a highly accurate way.
    “This is important because in today’s world Bluetooth poses a more significant threat as it is a frequent and constant wireless signal emitted from all our personal mobile devices,” said Nishant Bhaskar, a Ph.D. student in the UC San Diego Department of Computer Science and Engineering and one of the paper’s lead authors.
    The team, which includes researchers from the Departments of Computer Science and Engineering and Electrical and Computer Engineering, presented its findings at the IEEE Security & Privacy conference in Oakland, Calif., on May 24, 2022.
    All wireless devices have small manufacturing imperfections in the hardware that are unique to each device. These fingerprints are an accidental byproduct of the manufacturing process. These imperfections in Bluetooth hardware result in unique distortions, which can be used as a fingerprint to track a specific device. For Bluetooth, this would allow an attacker to circumvent anti-tracking techniques such as constantly changing the address a mobile device uses to connect to Internet networks.
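    Conceptually, such tracking matches low-level transmitter imperfections rather than the randomized address. A toy sketch of that matching step, with invented feature values standing in for measured hardware characteristics:
    ```python
    import math

    # Hypothetical per-device hardware features: (carrier frequency offset in kHz,
    # I/Q amplitude imbalance). Real values would be estimated from captured
    # Bluetooth packets; these numbers are invented for illustration.
    known_devices = {
        "device_A": (12.3, 0.021),
        "device_B": (-4.7, 0.008),
    }

    def closest_device(observed, known):
        """Match an observed fingerprint to the nearest previously seen device."""
        return min(known, key=lambda name: math.dist(observed, known[name]))

    # A new packet arrives with a fresh, randomized address but device_A-like imperfections:
    observation = (12.1, 0.020)
    print(closest_device(observation, known_devices))  # -> device_A
    ```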