A deep learning formulation of elastic FWI with numerical and parameterization analysis
Tianze Zhang, Kristopher A. Innanen, Jian Sun, Daniel O. Trad
In this paper we extend recent work on formulating seismic full waveform inversion (FWI) within a deep learning environment. We are motivated both by the possibility of combining training over multiple datasets with the relatively low dimensionality of a theory-guided network design, and by the fact that in doing so we implement an FWI algorithm ready-made for new computational architectures. A recurrent neural network is set up with rules enforcing elastic wave propagation, with the wavefield projected onto a measurement surface acting as the labelled data to be compared with observed seismic data. Training this network amounts to carrying out elastic FWI (eFWI). Based on automatic differentiation, the exact gradient can be constructed by inspection and use of the computational graph; this gradient acts to update the elastic model. We prepare our approach to mitigate cross-talk, a general property of multiparameter full waveform inversion algorithms, by allowing relative freedom in the choice of eFWI parameterization. The influence of random noise is also examined.
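To illustrate the idea of treating wave propagation as a recurrent network whose trainable parameters are the earth model, the sketch below sets up a deliberately simplified 1D scalar-wave analogue (not the authors' 2D elastic implementation) in PyTorch. Each finite-difference time step plays the role of one RNN cell, the receiver trace is the labelled output, and automatic differentiation through the unrolled computational graph supplies the gradient that updates the velocity model. Grid size, time step, source wavelet, receiver position, and optimizer settings are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

# Illustrative 1D scalar-wave "RNN": one finite-difference time step per cell.
# Assumed (hypothetical) discretization parameters:
nx, nt, dx, dt = 200, 400, 10.0, 1e-3

# "True" model used only to synthesize observed data; trainable starting model.
v_true = torch.full((nx,), 2000.0)
v_true[nx // 2:] = 2500.0
v = torch.full((nx,), 2200.0, requires_grad=True)   # model parameters to be trained

# Simple Gaussian source pulse injected at the grid centre; receiver near the surface.
t = torch.arange(nt) * dt
src = torch.exp(-((t - 0.05) / 0.01) ** 2)
src_onehot = torch.zeros(nx)
src_onehot[nx // 2] = 1.0
rec_idx = 20

def forward(vel):
    """Unrolled recurrence enforcing the (scalar) wave equation; returns the receiver trace."""
    u_prev = torch.zeros(nx)
    u_curr = torch.zeros(nx)
    c2 = (vel * dt / dx) ** 2
    trace = []
    for it in range(nt):
        # Second spatial derivative, zero-padded at the boundaries.
        lap = F.pad(u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2], (1, 1))
        u_next = 2.0 * u_curr - u_prev + c2 * lap + src_onehot * src[it]
        trace.append(u_next[rec_idx])       # project the wavefield onto the "measurement surface"
        u_prev, u_curr = u_curr, u_next
    return torch.stack(trace)

d_obs = forward(v_true).detach()            # synthetic observed data (labels)

# Training the network = inverting for the model: gradients come from the computational graph.
opt = torch.optim.Adam([v], lr=10.0)
for epoch in range(20):
    opt.zero_grad()
    loss = torch.mean((forward(v) - d_obs) ** 2)
    loss.backward()                         # exact gradient w.r.t. the velocity model via autodiff
    opt.step()
```

In the elastic, multiparameter setting described in the paper, the single velocity vector above would be replaced by several physical parameter fields (for example density and the two Lamé parameters, or velocities and impedances, depending on the chosen parameterization), and the recurrence would enforce the elastic equations of motion; the autodiff training loop itself is unchanged.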