Comparisons of parameter choice methods for regularization with discrete noisy data

Published under licence by IOP Publishing Ltd
Citation: Mark A Lukas 1998 Inverse Problems 14 161. DOI: 10.1088/0266-5611/14/1/014


Abstract

Several prominent methods have been developed for the crucial selection of the parameter in regularization of linear ill-posed problems with discrete, noisy data. The discrepancy principle (DP), minimum bound (MB) method and generalized cross-validation (GCV) are known to be at least weakly asymptotically optimal with respect to appropriate loss functions as the number n of data points approaches infinity. We compare these methods in three other ways. First, n is taken to be fixed and, using a discrete Picard condition, upper and lower bounds on the 'expected' DP and MB estimates are derived in terms of the optimal parameters with respect to the risk and expected error. Next, we define a simple measure of the variability of a practical estimate and, for each of the five methods, determine its asymptotic behaviour. The results are that the asymptotic stability of GCV is the same as for the unbiased risk method and is superior to that of DP, which is better than for MB and an unbiased error method. Finally, the results of numerical simulations of the five methods demonstrate that the theoretical conclusions hold in practice.
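To make the parameter-choice setting concrete, the following is a minimal sketch (not the paper's own implementation) of GCV for standard Tikhonov regularization on a discrete linear problem. All names, the test problem, and the noise level are illustrative assumptions; the GCV score used is the standard form n‖(I − H(λ))b‖² / [tr(I − H(λ))]², where H(λ) is the influence matrix.

```python
import numpy as np

def gcv_score(A, b, lam):
    """GCV score for Tikhonov regularization with parameter lam.

    H(lam) = A (A^T A + lam I)^{-1} A^T is the influence matrix mapping
    the data b to the fitted values.  GCV trades the residual norm off
    against the effective degrees of freedom through trace(I - H)."""
    n, p = A.shape
    H = A @ np.linalg.solve(A.T @ A + lam * np.eye(p), A.T)
    resid = b - H @ b
    return n * float(resid @ resid) / np.trace(np.eye(n) - H) ** 2

# Illustrative ill-conditioned test problem with noisy data (assumed setup)
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0.0, 1.0, 50), 8, increasing=True)
x_true = np.ones(8)
b = A @ x_true + 0.01 * rng.standard_normal(50)

# Choose lambda by minimizing the GCV score over a logarithmic grid
lams = np.logspace(-10, 0, 60)
lam_gcv = lams[np.argmin([gcv_score(A, b, lam) for lam in lams])]
x_reg = np.linalg.solve(A.T @ A + lam_gcv * np.eye(8), A.T @ b)
```

In practice the grid search would be replaced by a one-dimensional minimizer, and for large n the influence matrix would not be formed explicitly (e.g. via the SVD of A); the grid version keeps the sketch self-contained.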

