More stories

  • Making AI smarter with an artificial, multisensory integrated neuron

    The feel of a cat’s fur can reveal some information, but seeing the feline provides critical details: is it a housecat or a lion? While the sound of fire crackling may be ambiguous, its scent confirms that wood is burning. Our senses synergize to give a comprehensive understanding, particularly when individual signals are subtle. The collective sum of biological inputs can be greater than their individual contributions. Robots, by contrast, tend to combine sensory inputs by straightforward addition, but Penn State researchers have now harnessed the biological concept for application in artificial intelligence (AI) to develop the first artificial, multisensory integrated neuron.
    Led by Saptarshi Das, associate professor of engineering science and mechanics at Penn State, the team published their work on September 15 in Nature Communications.
    “Robots make decisions based on the environment they are in, but their sensors do not generally talk to each other,” said Das, who also has joint appointments in electrical engineering and in materials science and engineering. “A collective decision can be made through a sensor processing unit, but is that the most efficient or effective method? In the human brain, one sense can influence another and allow the person to better judge a situation.”
    For instance, a car might have one sensor scanning for obstacles, while another senses darkness to modulate the intensity of the headlights. Individually, these sensors relay information to a central unit which then instructs the car to brake or adjust the headlights. According to Das, this process consumes more energy. Allowing sensors to communicate directly with each other can be more efficient in terms of energy and speed — particularly when the inputs from both are faint.
    “Biology enables small organisms to thrive in environments with limited resources, minimizing energy consumption in the process,” said Das, who is also affiliated with the Materials Research Institute. “The requirements for different sensors are based on the context — in a dark forest, you’d rely more on listening than seeing, but we don’t make decisions based on just one sense. We have a complete sense of our surroundings, and our decision making is based on the integration of what we’re seeing, hearing, touching, smelling, etcetera. The senses evolved together in biology, but separately in AI. In this work, we’re looking to combine sensors and mimic how our brains actually work.”
    The team focused on integrating a tactile sensor and a visual sensor so that the output of one sensor modifies the other, with the help of visual memory. According to Muhtasim Ul Karim Sadaf, a third-year doctoral student in engineering science and mechanics, even a short-lived flash of light can significantly enhance the chance of successful movement through a dark room.
    “This is because visual memory can subsequently influence and aid the tactile responses for navigation,” Sadaf said. “This would not be possible if our visual and tactile cortex were to respond to their respective unimodal cues alone. We have a photo memory effect, where light shines and we can remember. We incorporated that ability into a device through a transistor that provides the same response.”
    The researchers fabricated the multisensory neuron by connecting a tactile sensor to a phototransistor based on a monolayer of molybdenum disulfide, a compound whose distinctive electrical and optical characteristics make it well suited both to detecting light and to serving as a transistor channel. The sensor generates electrical spikes in a manner reminiscent of neurons processing information, allowing it to integrate both visual and tactile cues.
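    The behavior described above can be sketched numerically: two faint cues are worth more together than apart when integration is super-additive, and a decaying trace of an earlier light flash can boost the purely tactile pathway. Every function, constant and unit below is invented for illustration; this is a toy model, not the team's device physics.

```python
import math

def multisensory_response(visual, tactile, memory=0.0):
    """Toy super-additive integration: a cross-modal gain term makes the
    combined response to weak visual and tactile cues exceed the sum of
    the two unimodal responses. Arbitrary units throughout."""
    effective_visual = visual + memory  # visual memory boosts the visual pathway
    return effective_visual + tactile + 2.0 * effective_visual * tactile

def visual_memory(flash_intensity, t, tau=1.0):
    """Exponentially decaying trace of a brief light flash seen t seconds ago."""
    return flash_intensity * math.exp(-t / tau)

# Two faint cues sensed separately versus together:
alone = multisensory_response(0.2, 0.0) + multisensory_response(0.0, 0.2)
together = multisensory_response(0.2, 0.2)
print(together > alone)  # True: the whole exceeds the sum of the parts

# A flash half a second ago still aids purely tactile navigation in the dark:
mem = visual_memory(1.0, t=0.5)
print(multisensory_response(0.0, 0.2, memory=mem) > multisensory_response(0.0, 0.2))  # True
```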

  • Groundbreaking soft valve technology enabling sensing and control integration in soft robots

    Soft inflatable robots have emerged as a promising paradigm for applications that require inherent safety and adaptability. However, the integration of sensing and control systems in these robots has posed significant challenges without compromising their softness, form factor, or capabilities. Addressing this obstacle, a research team jointly led by Professor Jiyun Kim (Department of New Material Engineering, UNIST) and Professor Jonbum Bae (Department of Mechanical Engineering, UNIST) has developed groundbreaking “soft valve” technology — an all-in-one solution that integrates sensors and control valves while maintaining complete softness.
    Traditionally, soft robot bodies coexisted with rigid electronic components for perception purposes. The study conducted by this research team introduces a novel approach to overcome this limitation by creating soft analogs of sensors and control valves that operate without electricity. The resulting tube-shaped part serves dual functions: detecting external stimuli and precisely controlling driving motion using only air pressure. By eliminating the need for electricity-dependent components, these all-soft valves enable safe operation underwater or in environments where sparks may pose risks — while simultaneously reducing weight burdens on robotic systems. Moreover, each component is inexpensive at approximately 800 Won.
    “Previous soft robots had flexible bodies but relied on hard electronic parts for stimulus detection sensors and drive control units,” explained Professor Kim. “Our study focuses on making both sensors and drive control parts using soft materials.”
    The research team showcased various applications utilizing this groundbreaking technology. They created universal tongs capable of delicately picking up fragile items such as potato chips — preventing breakage caused by excessive force exerted by conventional rigid robot hands. Additionally, they successfully employed these all-soft components to develop wearable elbow assist robots designed to reduce muscle burden caused by repetitive tasks or strenuous activities involving arm movements. The elbow support automatically adjusts according to the angle at which an individual’s arm is bent — a breakthrough contributing to a 63% average decrease in the force exerted on the elbow when wearing the robot.
    The soft valve operates by utilizing air flow within a tube-shaped structure. When tension is applied to one end of the tube, a helically wound thread inside compresses it, controlling inflow and outflow of air. This accordion-like motion allows for precise and flexible movements without relying on electrical power.
    Furthermore, the research team confirmed that by programming different structures or numbers of threads within the tube, they could accurately control airflow variations. This programmability enables customized adjustments to suit specific situations and requirements — providing flexibility in driving unit response even with consistent external forces applied to the end of the tube.
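    The tension-to-airflow behavior described above can be caricatured in a few lines. This is an illustrative sketch, not the UNIST team's calibration; `valve_flow`, its constants and its functional form are assumptions chosen only to show how "programming" the thread count changes the flow response to the same pull.

```python
def valve_flow(tension, n_threads, max_flow=1.0, k=0.5):
    """Toy model of the soft valve: pulling on the tube tightens the
    helical thread, squeezing the tube and throttling airflow. More
    threads (or a stiffer winding constant k) choke flow at lower tension.
    Arbitrary units; illustrative only."""
    constriction = k * n_threads * tension
    return max_flow / (1.0 + constriction)

# The same pull, applied to tubes programmed with different thread counts,
# yields different airflow responses:
for n in (1, 3, 6):
    print(n, round(valve_flow(tension=2.0, n_threads=n), 3))
```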
    “These newly developed components can be easily employed using material programming alone, eliminating electronic devices,” expressed Professor Bae with excitement about this development. “This breakthrough will significantly contribute to advancements in various wearable systems.”
    This groundbreaking soft valve technology marks a significant step toward fully soft, electronics-free robots capable of autonomous operation — a crucial milestone for enhancing safety and adaptability across numerous industries.
    Support for this work was provided by various organizations including Korea’s National Research Foundation (NRF), Korea Institute of Materials Science (KIMS), and Korea Evaluation Institute of Industrial Technology (KEIT).

  • ‘The Deepest Map’ explores the thrills — and dangers — of charting the ocean

    The Deepest Map
    Laura Trethewey
    Harper Wave, $32

    In 2019, the multimillionaire and explorer Victor Vescovo made headlines when he became the first person to visit the deepest parts of all five of Earth’s oceans. But arguably the real star of the expedition was marine geologist Cassie Bongiovanni, the lead ocean mapper who ensured Vescovo piloted his submersible to the actual deepest depths.

    Today, only 25 percent of the seafloor is well mapped. When Vescovo set out to score his record, the exact deepest location in each ocean was unknown. Bongiovanni, Vescovo and their crew had to chart these regions in detail before each dive.

    “Traditionally, captains never cared about the seafloor as long as it stayed far enough away from the hulls of their ships,” journalist Laura Trethewey writes in The Deepest Map. The book explores humankind’s quest to map the seafloor, framed around Bongiovanni’s adventures.

    Seafloor topography has been a big concern for militaries patrolling Neptunian frontiers with nuclear submarines and companies facilitating intercontinental communication via subsea cables (SN: 4/10/21, p. 28). In recent decades, seafloor data have become crucial to the deep-sea mining industries searching for metals needed to produce green technology.

    Satellites have revealed many of the knobs and crevices visible in the deep blue of Google Maps. But with that relatively coarse information, entire mountains can be missed. To see the seafloor in high resolution requires a sophisticated sonar system aboard a big ship that sends sound signals from the sea surface into the abyss.

    Mappers like Bongiovanni calculate depth from the time it takes for the signal to travel down and bounce back to the surface. These state-of-the-art sonar systems transform “the satellite-predicted blur into a sharp three-dimensional terrain of ripples, cracks and tears in the seafloor,” Trethewey writes. “The seafloor is ‘heard,’ rather than seen.”
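    The arithmetic behind this is compact: depth is half the two-way travel time multiplied by the speed of sound in seawater. A minimal sketch using a typical nominal sound speed (real surveys correct for temperature, salinity and pressure profiles):

```python
def depth_from_echo(two_way_time_s, sound_speed_m_s=1500.0):
    """Depth implied by a sonar ping's round-trip travel time.
    1500 m/s is a common nominal mean sound speed in seawater."""
    return sound_speed_m_s * two_way_time_s / 2.0

# An echo returning after 14.6 s implies a depth near that of Challenger Deep:
print(depth_from_echo(14.6))  # 10950.0 (meters)
```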

    Into her tale, Trethewey weaves stories of tagging along with scientists and ocean mappers. That includes her inaugural adventure at sea, which a crew member noted was “pretty rough for a first-timer,” as he and Trethewey clung to a doorframe in near gale force winds. On this cruise aboard the research vessel E/V Nautilus, which was surveying a poorly mapped stretch of California’s coast, Trethewey (and readers) is introduced to the art and science of seafloor mapping. She also learned firsthand that mapping is especially difficult — and sometimes impossible — when the ocean is angry.

    Trethewey’s insightful writing helps readers understand just why mapping the ocean — even in shallow coastal waters — is crucial to so many endeavors. She visits a remote Inuit village on the western bank of Canada’s Hudson Bay, where she joins hunters who map ever-changing coastlines for their own safety. Later, she scuba dives with archaeologists in Florida who use underwater maps to explore remnants of early human history that have been submerged for thousands of years.

    Seabed 2030, an effort to create a complete map of the entire seafloor by the end of this decade, remains a distant and possibly unreachable goal. Because the oceans are vast and replete with remote and dangerous places that people simply can’t or shouldn’t go, the effort will almost certainly require autonomous surface vehicles armed with sonar. Such devices are already probing the depths and sending back data.

    Sipping coffee and staring at computer screens in a sun-filled conference room, Trethewey watches as a drone outfitted with cameras, environmental sensors and a sonar system maps a bit of seafloor off California. “The future of ocean mapping weirdly felt a lot like checking social media or doing anything else on your phone these days,” she wryly observes.

    Trethewey’s book is about more than just mapping the oceans. It’s also about what can go wrong when explorers explore. It’s hard to read The Deepest Map without being reminded of the recent implosion of the Titan submersible in the North Atlantic that killed everyone on board in June. Indeed, Trethewey describes how, during Vescovo’s first solo dive, his colleagues endured 25 minutes of apprehension-turned-alarm when they didn’t hear from him.

    She also reminds us how easily exploration can turn into exploitation. In the not-so-distant past, Europeans “discovered” the so-called New World and mapped it, Trethewey writes. Exploitation followed. Scientists and environmentalists alike are now concerned that a full, detailed map of the ocean floor might lead to the destruction of delicate, mostly unknown habitats if deep-sea miners are allowed to extract metals.

    Trethewey envisions a different outcome. Seabed 2030’s mapping effort may help people see that “the weird, wonderful deep-sea world is not a blank space, another frontier to use up and throw away,” and should be safeguarded for scientists “to uncover our past and protect our future.”

    Buy The Deepest Map from Bookshop.org. Science News is a Bookshop.org affiliate and will earn a commission on purchases made from links in this article.

  • Are US teenagers more likely than others to exaggerate their math abilities?

    A major new study has revealed that American teenagers are more likely than any other nationality to brag about their math ability.
    Research using data from 40,000 15-year-olds across nine English-speaking nations found that those in North America were the most likely to exaggerate their mathematical knowledge, while those in Ireland and Scotland were the least likely to do so.
    The study, published in the peer-reviewed journal Assessment in Education: Principles, Policy & Practice, used responses from the OECD Programme for International Student Assessment (PISA), in which participants took a two-hour maths test alongside a 30-minute background questionnaire.
    They were asked how familiar they were with each of 16 mathematical terms — but three of the terms were fake.
    Further questions revealed those who claimed familiarity with non-existent mathematical concepts were also more likely to display overconfidence in their academic prowess, problem-solving skills and perseverance.
    For instance, they claimed higher levels of competence in calculating a discount on a television and in finding their way to a destination. Two thirds of those most likely to overestimate their mathematical ability were confident they could work out the petrol consumption of a car, compared to just 40 per cent of those least likely to do so.
    Those likely to over-claim were also more likely to say that, if their mobile phone stopped sending texts, they would consult a manual (41 per cent versus 30 per cent), while those least likely to over-claim tended to say they would react by pressing all the buttons (56 per cent versus 49 per cent).
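    The over-claiming measure lends itself to a simple sketch. The term lists below are placeholders invented for illustration, not the actual PISA items; the point is only that claimed familiarity with nonexistent concepts yields a direct over-claiming score.

```python
# Hypothetical term lists for illustration; the real questionnaire used
# 16 mathematical terms, 3 of which were invented.
REAL_TERMS = {"polygon", "vector", "cosine", "linear equation"}
FAKE_TERMS = {"modal polygon", "inductive fraction", "quadratic verb"}

def overclaiming_score(claimed_familiar):
    """Fraction of the fake terms a respondent claims to know:
    0.0 means no over-claiming, 1.0 means claimed familiarity with
    every nonexistent concept."""
    return len(FAKE_TERMS & set(claimed_familiar)) / len(FAKE_TERMS)

honest = overclaiming_score({"polygon", "cosine"})
braggart = overclaiming_score({"polygon", "modal polygon", "quadratic verb"})
print(honest, braggart)  # 0.0 0.6666666666666666
```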

  • AI-driven tool makes it easy to personalize 3D-printable models

    As 3D printers have become cheaper and more widely accessible, a rapidly growing community of novice makers are fabricating their own objects. To do this, many of these amateur artisans access free, open-source repositories of user-generated 3D models that they download and fabricate on their 3D printer.
    But adding custom design elements to these models poses a steep challenge for many makers, since it requires the use of complex and expensive computer-aided design (CAD) software, and is especially difficult if the original representation of the model is not available online. Plus, even if a user is able to add personalized elements to an object, ensuring those customizations don’t hurt the object’s functionality requires an additional level of domain expertise that many novice makers lack.
    To help makers overcome these challenges, MIT researchers developed a generative-AI-driven tool that enables the user to add custom design elements to 3D models without compromising the functionality of the fabricated objects. A designer could utilize this tool, called Style2Fab, to personalize 3D models of objects using only natural language prompts to describe their desired design. The user could then fabricate the objects with a 3D printer.
    “For someone with less experience, the essential problem they faced has been: Now that they have downloaded a model, as soon as they want to make any changes to it, they are at a loss and don’t know what to do. Style2Fab would make it very easy to stylize and print a 3D model, but also experiment and learn while doing it,” says Faraz Faruqi, a computer science graduate student and lead author of a paper introducing Style2Fab.
    Style2Fab is driven by deep-learning algorithms that automatically partition the model into aesthetic and functional segments, streamlining the design process.
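    The partition-then-stylize idea can be sketched in a few lines. Everything below is hypothetical: the segment representation, the classifier heuristic, and the function names are stand-ins for illustration, not Style2Fab's actual deep-learning pipeline.

    ```python
    # Hypothetical sketch: segments judged "functional" are left untouched,
    # and only "aesthetic" segments receive the requested style. The toy
    # heuristic below is a stand-in for Style2Fab's learned segmentation.
    def classify_segment(segment):
        # Toy rule: anything that mates with another part is functional.
        return "functional" if segment["mates_with_other_parts"] else "aesthetic"

    def stylize(model_segments, style_prompt):
        styled = []
        for seg in model_segments:
            if classify_segment(seg) == "aesthetic":
                seg = {**seg, "style": style_prompt}  # restyle surface only
            styled.append(seg)
        return styled

    # A thumb splint: the clip must keep its geometry, the shell may change.
    splint = [
        {"name": "strap_clip", "mates_with_other_parts": True},
        {"name": "outer_shell", "mates_with_other_parts": False},
    ]
    result = stylize(splint, "floral pattern")
    print([(s["name"], s.get("style")) for s in result])
    ```

    The key design point this illustrates is that stylization is gated by the segment's predicted role, so load-bearing or mating geometry is never altered.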
    In addition to empowering novice designers and making 3D printing more accessible, Style2Fab could also be utilized in the emerging area of medical making. Research has shown that considering both the aesthetic and functional features of an assistive device increases the likelihood a patient will use it, but clinicians and patients may not have the expertise to personalize 3D-printable models.
    With Style2Fab, a user could customize the appearance of a thumb splint so it blends in with her clothing without altering the functionality of the medical device, for instance. Providing a user-friendly tool for the growing area of DIY assistive technology was a major motivation for this work, adds Faruqi.

  • in

    Verbal nonsense reveals limitations of AI chatbots

    The era of artificial-intelligence chatbots that seem to understand and use language the way we humans do has begun. Under the hood, these chatbots use large language models, a particular kind of neural network. But a new study shows that large language models remain vulnerable to mistaking nonsense for natural language. To a team of researchers at Columbia University, it’s a flaw that might point toward ways to improve chatbot performance and help reveal how humans process language.
    In a paper published online today in Nature Machine Intelligence, the scientists describe how they challenged nine different language models with hundreds of pairs of sentences. For each pair, people who participated in the study picked which of the two sentences they thought was more natural, meaning that it was more likely to be read or heard in everyday life. The researchers then tested the models to see if they would rate each sentence pair the same way the humans had.
    In head-to-head tests, more sophisticated AIs based on what researchers refer to as transformer neural networks tended to perform better than simpler recurrent neural network models and statistical models that just tally the frequency of word pairs found on the internet or in online databases. But all the models made mistakes, sometimes choosing sentences that sound like nonsense to a human ear.
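    The simplest baseline in that lineup, a model that just tallies word-pair frequencies, can be sketched in a few lines. The corpus and scoring below are illustrative toys, not the paper's actual setup or data.

    ```python
    from collections import Counter

    # Toy stand-in for a word-pair frequency model: count adjacent word
    # pairs in a tiny corpus (hypothetical data, not the study's corpora).
    corpus = "the cat sat on the mat the dog sat on the rug".split()
    pair_counts = Counter(zip(corpus, corpus[1:]))

    def score(sentence):
        """Sum of corpus word-pair frequencies; higher reads as more 'natural'."""
        words = sentence.split()
        return sum(pair_counts[pair] for pair in zip(words, words[1:]))

    print(score("the cat sat on the mat"))   # familiar word order scores high
    print(score("mat the on sat cat the"))   # scrambled order scores low
    ```

    A model like this prefers whichever sentence reuses frequent pairs, which is exactly why it can be fooled: a fluent-sounding sequence of common pairs scores well even when the whole sentence is nonsense.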
    “That some of the large language models perform as well as they do suggests that they capture something important that the simpler models are missing,” said Nikolaus Kriegeskorte, PhD, a principal investigator at Columbia’s Zuckerman Institute and a coauthor on the paper. “That even the best models we studied still can be fooled by nonsense sentences shows that their computations are missing something about the way humans process language.”
    Consider the following sentence pair, which both human participants and the AIs assessed in the study:
    That is the narrative we have been sold.
    This is the week you have been dying.

  • in

    New camera offers ultrafast imaging at a fraction of the normal cost

    Capturing blur-free images of fast movements like falling water droplets or molecular interactions requires expensive ultrafast cameras that acquire millions of images per second. In a new paper, researchers report a camera that could offer a much less expensive way to achieve ultrafast imaging for a wide range of applications such as real-time monitoring of drug delivery or high-speed lidar systems for autonomous driving.
    “Our camera uses a completely new method to achieve high-speed imaging,” said Jinyang Liang from the Institut national de la recherche scientifique (INRS) in Canada. “It has an imaging speed and spatial resolution similar to commercial high-speed cameras but uses off-the-shelf components that would likely cost less than a tenth of today’s ultrafast cameras, which can start at close to $100,000.”
    In Optica, Optica Publishing Group’s journal for high-impact research, Liang together with collaborators from Concordia University in Canada and Meta Platforms Inc. show that their new diffraction-gated real-time ultrahigh-speed mapping (DRUM) camera can capture a dynamic event in a single exposure at 4.8 million frames per second. They demonstrate this capability by imaging the fast dynamics of femtosecond laser pulses interacting with liquid and laser ablation in biological samples.
    “In the long term, I believe that DRUM photography will contribute to advances in biomedicine and automation-enabling technologies such as lidar, where faster imaging would allow more accurate sensing of hazards,” said Liang. “However, the paradigm of DRUM photography is quite generic. In theory, it can be used with any CCD and CMOS cameras without degrading their other advantages such as high sensitivity.”
    Creating a better ultrafast camera
    Despite a great deal of progress in ultrafast imaging, today’s methods are still expensive and complex to implement. Their performance is also limited by trade-offs between the number of frames captured in each movie and light throughput or temporal resolution. To overcome these issues, the researchers developed a new time-gating method known as time-varying optical diffraction.
    Cameras use gates to control when light hits the sensor. For example, the shutter in a traditional camera is a type of gate that opens and closes once. In time-gating, the gate is opened and closed in quick succession a certain number of times before the sensor reads out the image. This captures a short high-speed movie of a scene.
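    The time-gating principle can be illustrated with a toy simulation. The scene, gate timings, and one-pixel "spot" below are invented for illustration; the sketch shows only the scheme of repeated gate openings before a single readout, not the DRUM camera's diffraction optics.

    ```python
    # Toy illustration of time-gating: a scene changes over time, the gate
    # opens briefly at several instants before one sensor readout, and each
    # opening contributes one sub-frame of a short high-speed movie.
    WIDTH = 16

    def scene(t):
        """1-D scene: a single bright pixel moving right one pixel per time unit."""
        return [1.0 if i == t % WIDTH else 0.0 for i in range(WIDTH)]

    gate_times = [0, 2, 4, 6]                  # instants at which the gate opens
    movie = [scene(t) for t in gate_times]     # sub-frames from one exposure
    positions = [frame.index(1.0) for frame in movie]
    print(positions)   # the spot's position recovered in each sub-frame
    ```

    Because all four gate openings happen within one exposure, the sensor needs only a single slow readout yet still records the spot at four distinct moments, which is the trade that makes high frame rates cheap.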

  • in

    Evolution wired human brains to act like supercomputers

    Scientists have confirmed that human brains are naturally wired to perform advanced calculations, much like a high-powered computer, to make sense of the world through a process known as Bayesian inference.
    In a study published in the journal Nature Communications, researchers from the University of Sydney, University of Queensland and University of Cambridge developed a specific mathematical model that closely matches how human brains work when interpreting visual information. The model contained everything needed to carry out Bayesian inference.
    Bayesian inference is a statistical method that combines prior knowledge with new evidence to make intelligent guesses. For example, if you know what a dog looks like and you see a furry animal with four legs, you might use your prior knowledge to guess it’s a dog.
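    Bayes' rule can be worked through numerically on the article's dog example. All the probabilities below are made-up illustrative values, not figures from the study.

    ```python
    # Bayes' rule on the dog example (all numbers hypothetical): start from a
    # prior belief, then update it with how likely the evidence (furry, four
    # legs) would be under each hypothesis.
    p_dog = 0.6                   # prior: most four-legged pets you meet are dogs
    p_cat = 0.4
    p_evidence_given_dog = 0.9    # likelihood of "furry, four legs" if a dog
    p_evidence_given_cat = 0.8    # ... and if a cat

    # Total probability of the evidence, then the posterior via Bayes' rule.
    p_evidence = p_evidence_given_dog * p_dog + p_evidence_given_cat * p_cat
    posterior_dog = p_evidence_given_dog * p_dog / p_evidence
    print(f"P(dog | furry, four legs) = {posterior_dog:.2f}")
    ```

    The posterior (about 0.63 here) ends up slightly above the prior because the evidence fits "dog" a bit better than "cat"; the claim in the study is that the visual system's wiring performs this kind of update on sensory data automatically.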
    This inherent capability enables people to interpret the environment with extraordinary precision and speed, unlike machines that can be bested by simple CAPTCHA security measures when prompted to identify fire hydrants in a panel of images.
    The study’s senior investigator Dr Reuben Rideaux, from the University of Sydney’s School of Psychology, said: “Despite the conceptual appeal and explanatory power of the Bayesian approach, how the brain calculates probabilities is largely mysterious.”
    “Our new study sheds light on this mystery. We discovered that the basic structure and connections within our brain’s visual system are set up in a way that allows it to perform Bayesian inference on the sensory data it receives.
    “What makes this finding significant is the confirmation that our brains have an inherent design that allows this advanced form of processing, enabling us to interpret our surroundings more effectively.”
    The study’s findings not only confirm existing theories about the brain’s use of Bayesian-like inference but open doors to new research and innovation, where the brain’s natural ability for Bayesian inference can be harnessed for practical applications that benefit society.