A numerical comparison of seismic inversion, multilayer and basis function neural networks

Brian H. Russell, Laurence R. Lines

In this presentation, a numerical example is used to illustrate the difference between geophysical inversion and several machine learning approaches to inversion. The results will show that, like geophysical inverse solutions, machine learning algorithms have a definite mathematical structure that can be written down and analyzed. The example used in this study is the extraction of the reflection coefficients from a synthetic created by convolving a dipole reflectivity with a symmetric three-point wavelet. This simple example leads to the topics of deconvolution, recursive inversion, linear regression and nonlinear regression using several machine learning techniques.

The first machine learning method discussed is the multilayer feedforward neural network (MLFN) with a single hidden layer consisting of two neurons. The other two methods discussed are the radial basis function neural network (RBFN) and the generalized regression neural network (GRNN). As will be shown, all three networks involve different basis functions: in the MLFN the basis function is the sigmoidal logistic function, and in both the RBFN and the GRNN it is the Gaussian. However, in the RBFN the weights are computed using a least-squares algorithm, whereas in the GRNN they are computed “on the fly” from the observed data. The MLFN algorithm is iterative, and its key parameters are the initial random weights, the learning rate and the number of iterations. The RBFN and GRNN are not iterative, and the key parameter in both methods is the width of the Gaussian, or sigma factor.

Figures 1(a) to (d) below compare the results of linear regression and the three machine learning algorithms. In all four figures, the horizontal axis represents the computed seismic amplitudes and the vertical axis represents the desired reflection coefficients. The black circles show the four training values, and the solid line shows the fitting function from the linear or nonlinear regression. For the MLFN result, 10,000 iterations and a learning rate of 0.2 were used; for both the RBFN and GRNN results, a sigma factor of 0.5 was used.
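As a concrete illustration of the forward model that generates the training data, the short sketch below convolves a dipole reflectivity with a symmetric three-point wavelet using NumPy. The numerical values of the reflectivity and wavelet are placeholders chosen for illustration only; the paper defines its own.

```python
import numpy as np

# Illustrative dipole reflectivity: two adjacent spikes of opposite polarity
# (placeholder values, not necessarily those used in the paper).
reflectivity = np.array([0.1, -0.1])

# Illustrative symmetric three-point wavelet (placeholder values).
wavelet = np.array([-0.5, 1.0, -0.5])

# The synthetic trace is the convolution of the reflectivity with the wavelet.
synthetic = np.convolve(wavelet, reflectivity)
print("synthetic amplitudes:", synthetic)
```

Note that convolving a two-point reflectivity with a three-point wavelet yields four output samples, consistent with the four training values plotted in Figure 1; the inversion problem is then to recover the two reflection coefficients from these four amplitudes.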
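The MLFN described above can be sketched as a single-hidden-layer network with two sigmoidal neurons trained by simple gradient descent. Only the architecture, the logistic activation, the learning rate of 0.2 and the 10,000 iterations follow the text; the training pairs, the linear output neuron and the weight initialization below are assumptions made for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Hypothetical training pairs: seismic amplitudes (inputs) and desired
# reflection coefficients (targets); the paper uses its own four values.
x = np.array([-0.05, 0.10, -0.10, 0.05])
t = np.array([0.0, 0.1, -0.1, 0.0])

# Single hidden layer with two sigmoidal neurons; the output neuron is
# taken here to be linear (an assumption, not stated in the abstract).
w1, b1 = rng.normal(size=2), rng.normal(size=2)   # input-to-hidden weights, biases
w2, b2 = rng.normal(size=2), rng.normal()         # hidden-to-output weights, bias

eta = 0.2                        # learning rate quoted in the text
for _ in range(10000):           # number of iterations quoted in the text
    for xi, ti in zip(x, t):
        h = sigmoid(w1 * xi + b1)          # forward pass: hidden activations
        y = np.dot(w2, h) + b2             # forward pass: network output
        err = y - ti                       # squared-error gradient at the output
        grad_h = err * w2 * h * (1.0 - h)  # backpropagated hidden-layer gradient
        w2 -= eta * err * h                # gradient-descent updates
        b2 -= eta * err
        w1 -= eta * grad_h * xi
        b1 -= eta * grad_h

print([float(np.dot(w2, sigmoid(w1 * xi + b1)) + b2) for xi in x])
```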
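The two Gaussian-basis networks can likewise be sketched in a few lines: the RBFN solves a small least-squares problem for its weights, while the GRNN forms a normalized, distance-weighted average of the training targets directly, i.e. “on the fly.” Again, the training values below are hypothetical; only the Gaussian basis and the sigma factor of 0.5 follow the text.

```python
import numpy as np

def gaussian(x, centers, sigma):
    # Gaussian basis functions centered on the training inputs.
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2.0 * sigma ** 2))

# Hypothetical training values (the paper uses its own four points).
x_train = np.array([-0.05, 0.10, -0.10, 0.05])   # seismic amplitudes
t_train = np.array([0.0, 0.1, -0.1, 0.0])        # reflection coefficients
sigma = 0.5                                       # sigma factor from the text

# RBFN: weights found by least squares on the Gaussian design matrix.
Phi = gaussian(x_train, x_train, sigma)
w, *_ = np.linalg.lstsq(Phi, t_train, rcond=None)

def rbfn_predict(x):
    return gaussian(np.atleast_1d(x), x_train, sigma) @ w

# GRNN: normalized Gaussian weighting of the training targets.
def grnn_predict(x):
    K = gaussian(np.atleast_1d(x), x_train, sigma)
    return (K @ t_train) / K.sum(axis=1)

print(rbfn_predict(x_train))   # should closely reproduce t_train
print(grnn_predict(x_train))   # smoothed estimates of t_train
```

The contrast between the two Gaussian networks is visible here: the RBFN requires a small matrix solve to obtain its weights, whereas the GRNN involves no fitting step beyond storing the training data.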