Artificial vision systems find a wide range of applications, including self-driving cars, object detection, crop monitoring, and smart cameras. Their designs are often inspired by the vision of biological organisms: human and insect vision have inspired terrestrial artificial vision, while fish eyes have inspired aquatic artificial vision. Despite remarkable progress, current artificial vision systems have limitations: they cannot image land and underwater environments equally well, and they are limited to a hemispherical (180°) field of view (FOV).
To overcome these issues, a group of researchers from Korea and the USA, including Professor Young Min Song of the Gwangju Institute of Science and Technology in Korea, has now designed a novel artificial vision system with omnidirectional imaging ability that works in both aquatic and terrestrial environments. Their study was made available online on 12 July 2022 and published in Nature Electronics on 11 July 2022.
“Research in bio-inspired vision often results in a novel development that did not exist before. This, in turn, enables a deeper understanding of nature and ensures that the developed imaging device is both structurally and functionally effective,” says Prof. Song, explaining his motivation behind the study.
The inspiration for the system came from the fiddler crab (Uca arcuata), a semiterrestrial crab species with amphibious imaging ability and a 360° FOV. These remarkable features stem from two aspects of the fiddler crab’s compound eyes: an ellipsoidal eye stalk that enables panoramic imaging, and flat corneas with a graded refractive index profile that allow amphibious imaging.
Accordingly, the researchers developed a vision system consisting of an array of flat micro-lenses with a graded refractive index profile, integrated into a flexible comb-shaped silicon photodiode array and then mounted onto a spherical structure. The graded refractive index and the flat surface of the micro-lens were optimized to offset the defocusing effects caused by changes in the external environment. Put simply, light rays traveling through different media (with different refractive indices) were made to focus at the same spot.
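The flat-lens idea can be illustrated with paraxial optics: the power of a single refracting surface is (n_out − n_in)/R, so a flat front surface (R → ∞) contributes no power whether it faces air or water, leaving the focusing to the index profile inside the lens. The short Python sketch below is only an illustration of that principle under assumed refractive indices and a hypothetical curvature; it is not the authors' actual lens prescription.

```python
# Minimal paraxial sketch (assumption: single-surface thin-lens approximation,
# illustrative indices and curvature only; not the design from the paper).
N_AIR, N_WATER = 1.000, 1.333
N_LENS = 1.52            # assumed glass-like lens index (hypothetical)
R_FRONT_CURVED = 2.0e-3  # 2 mm front radius of curvature for comparison (hypothetical)

def surface_power(n_in, n_out, radius):
    """Paraxial power of a single refracting surface: P = (n_out - n_in) / R."""
    return (n_out - n_in) / radius

for medium_name, n_medium in [("air", N_AIR), ("water", N_WATER)]:
    # A curved front surface: its power depends on the external medium, so the
    # focal point shifts when the camera moves from air to water.
    p_curved = surface_power(n_medium, N_LENS, R_FRONT_CURVED)
    # A flat front surface (R -> infinity): zero power in any medium, so the
    # focusing is left entirely to the internal graded-index profile.
    p_flat = surface_power(n_medium, N_LENS, float("inf"))
    print(f"{medium_name:>5}: curved-front power = {p_curved:8.1f} D, "
          f"flat-front power = {p_flat:4.1f} D")
```

Running the sketch shows the curved surface losing roughly two-thirds of its power in water while the flat surface contributes nothing in either medium, which is why a flat, graded-index lens keeps the focal spot fixed across air and water.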
To test the capabilities of their system, the team performed optical simulations and imaging demonstrations in air and water. Amphibious imaging was performed by immersing the device halfway in water. To their delight, the images produced by the system were clear and free of distortions. The team further showed that the system had a panoramic visual field, 300° horizontally and 160° vertically, in both air and water. Additionally, the spherical mount was only 2 cm in diameter, making the system compact and portable.
“Our vision system could pave the way for 360° omnidirectional cameras with applications in virtual or augmented reality, or all-weather vision for autonomous vehicles,” speculates Prof. Song excitedly.
Story Source:
Materials provided by GIST (Gwangju Institute of Science and Technology).