Semantic–Electromagnetic Inversion With Pretrained Multimodal Generative Model

Abstract: Across diverse domains of science and technology, electromagnetic (EM) inversion problems benefit from the ability to account for multimodal prior information to regularize their inherent ill-posedness. Indeed, besides priors that are formulated mathematically or learned from quantitative data, valuable prior information may be available in the form of text or images. Besides handling semantic multimodality, it is furthermore important to minimize the cost of adapting to a new physical measurement operator and to limit the requirements for costly labeled data. Here, these challenges are tackled with a frugal and multimodal semantic–EM inversion technique. The key ingredient is a multimodal generator of reconstruction results that can be pretrained, being agnostic to the physical measurement operator. The generator is fed by a multimodal foundation model encoding the multimodal semantic prior and a physical adapter encoding the measured data. For a new physical setting, only the lightweight physical adapter is retrained. The authors' architecture also enables a flexible iterative step-by-step solution to the inverse problem in which each step can be semantically controlled. The feasibility and benefits of this methodology are demonstrated for three EM inverse problems: a canonical two-dimensional inverse-scattering problem in numerical simulation, as well as three-dimensional and four-dimensional compressive microwave meta-imaging experiments.

Location
Deutsche Nationalbibliothek Frankfurt am Main
Extent
Online resource
Language
English

Published in
Advanced Science, 09.09.2024, 11 pages (online)

Authors
Chen, Yanjin
Zhang, Hongrui
Ma, Jie
Cui, Tie Jun
del Hougne, Philipp
Li, Lianlin

DOI
10.1002/advs.202406793
URN
urn:nbn:de:101:1-2409091446565.637009587371
Rights information
Open Access; access to the object is unrestricted.
Last updated
15.08.2025, 07:38 CEST

Data partner

This object is provided by:
Deutsche Nationalbibliothek. For questions about this object, please contact the data partner.
