Deep learning and metamaterials make the invisible visible

Credit: Bakhtiyar Orazbayev / EPFL

By combining purpose-built materials and neural networks, researchers at EPFL have shown that sound can be used in high-resolution imaging.


Imaging lets us portray an object by analyzing the light or sound waves it transmits or emits. The shorter the wave, the higher the resolution of the image. Until now, however, the level of detail has been limited by the size of the wavelength in question. Researchers at EPFL's Laboratory of Wave Engineering have successfully demonstrated that a long, and therefore imprecise, wave (in this case a sound wave) can resolve details 30 times smaller than its wavelength. To achieve this, the research team used a combination of metamaterials – specially engineered elements – and artificial intelligence. Their research, which has just been published in Physical Review X, opens up exciting new possibilities, particularly in the fields of medical imaging and bioengineering.
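To put those numbers in perspective, here is a minimal back-of-the-envelope sketch. The speed of sound and the 350 Hz source frequency are assumptions chosen only to give a roughly one-meter wavelength like the one quoted later in the article; they are not taken from the paper.

```python
# Back-of-the-envelope check of the sub-wavelength claim.
# Assumptions (not from the paper): speed of sound in air ~343 m/s,
# source frequency ~350 Hz, chosen to give a ~1 m wavelength.
speed_of_sound = 343.0   # m/s, room-temperature air
frequency = 350.0        # Hz (assumed)

wavelength = speed_of_sound / frequency   # ~0.98 m
diffraction_limit = wavelength / 2        # ~0.49 m, classical half-wavelength limit
reported_detail = wavelength / 30         # ~3.3 cm, the ~30x sub-wavelength factor

print(f"wavelength          ~ {wavelength:.2f} m")
print(f"diffraction limit   ~ {diffraction_limit * 100:.0f} cm")
print(f"reported resolution ~ {reported_detail * 100:.1f} cm")
```

A few-centimeter resolution obtained with a meter-long wave is what the article means by moving "well beyond the diffraction limit."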

The team’s groundbreaking idea was to bring together two separate technologies that had each, on its own, pushed the boundaries of imaging. One is metamaterials: purpose-built elements that can, for example, focus wavelengths precisely. They are, however, known to lose their effectiveness by absorbing signals in a haphazard way that makes them hard to decipher. The other is artificial intelligence, or more specifically neural networks, which can process even the most complex information quickly and efficiently, although doing so requires training.

To overcome what is known in physics as the diffraction limit, the research team – led by Romain Fleury – conducted the following experiment. They first created a grid of 64 miniature speakers, each of which could be activated according to the pixels in an image. They then used the grid to reproduce sound images of the digits zero through nine with very precise spatial detail; the digit images fed to the grid were drawn from a database of some 70,000 handwritten examples. Facing the grid, the researchers placed an array of 39 Helmholtz resonators (10-cm spheres with a hole at one end) that together formed a metamaterial. The sound produced by the grid was transported through the metamaterial and picked up by four microphones placed several meters away. Algorithms then deciphered the sound recorded by the microphones, learning to recognize and redraw the original digit images.
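The article does not spell out the network architecture, so the following is only a minimal sketch of the kind of learning step described above: a small fully connected network, with hypothetical layer sizes, that maps the four microphone recordings to a reconstructed 28x28 digit image and is trained with a pixel-wise reconstruction loss. All names, dimensions, and the random stand-in data are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

# Illustrative dimensions (assumptions, not from the paper):
# 4 microphones x 256 time samples per recording, reconstructed as a 28x28 image.
N_MICS, N_SAMPLES, IMG_SIDE = 4, 256, 28

class AcousticDecoder(nn.Module):
    """Toy network: far-field microphone recordings -> digit image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                        # (B, 4, 256) -> (B, 1024)
            nn.Linear(N_MICS * N_SAMPLES, 512),
            nn.ReLU(),
            nn.Linear(512, IMG_SIDE * IMG_SIDE),
            nn.Sigmoid(),                        # pixel intensities in [0, 1]
        )

    def forward(self, recordings):
        return self.net(recordings).view(-1, IMG_SIDE, IMG_SIDE)

# One hypothetical training step on random stand-in data.
model = AcousticDecoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()                           # pixel-wise reconstruction loss

recordings = torch.randn(8, N_MICS, N_SAMPLES)   # stand-in microphone signals
target_images = torch.rand(8, IMG_SIDE, IMG_SIDE)  # stand-in handwritten digits

prediction = model(recordings)
loss = loss_fn(prediction, target_images)
loss.backward()
optimizer.step()
print(f"reconstruction loss: {loss.item():.4f}")
```

In practice, the point of the experiment is that the network learns to undo the scrambling introduced by the resonator array, so the recordings would be measured through the metamaterial rather than generated at random as they are in this sketch.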

An advantageous disadvantage

The experiment was successful almost 90% of the time. “By generating images with a resolution of only a few centimeters – using a sound wave whose wavelength was about a meter – we have moved well beyond the diffraction limit,” says Romain Fleury. “What’s more, the tendency of metamaterials to absorb signals, which was considered a major drawback, turns out to be an advantage when neural networks are involved. We found that they work better when there is a lot of absorption.”

In the field of medical imaging, using long waves to see very small objects could be a major breakthrough. “Long waves mean that doctors can use much lower frequencies, resulting in acoustic imaging methods that are effective even through dense bone tissue. And when it comes to imaging with electromagnetic waves, long waves are less harmful to a patient’s health. For these types of applications, we wouldn’t train the neural networks to recognize or reproduce numbers, but rather organic structures,” says Fleury.




More information:
Bakhtiyar Orazbayev et al. Far-Field Subwavelength Acoustic Imaging by Deep Learning, Physical Review X (2020). DOI: 10.1103/PhysRevX.10.031029

Provided by Ecole Polytechnique Federale de Lausanne

Citation: Deep learning and metamaterials make the invisible visible (2020, August 10) Retrieved August 10, 2020 from https://phys.org/news/2020-08-deep-metamaterials-invisible-visible.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.