Wavelet based compression and denoising of optical tomography data

https://doi.org/10.1016/S0030-4018(99)00294-1

Abstract

Two methods based on wavelet/wavelet packet expansion are presented for denoising and compressing optical tomography data corrupted by noise from scattering. In the first, the wavelet expansion coefficients of the noisy data are shrunk using a soft threshold. In the second, the data are expanded into a wavelet packet tree, over which a best-basis search is performed; the resulting coefficients are truncated on the basis of energy content. The first method efficiently denoises experimental data for scattering particle densities in the surrounding medium of up to 12.0×10⁶ per cm³, and achieves a compression ratio of ≈8:1. The wavelet packet based method achieves compression of up to 11:1 and also exhibits reasonable noise reduction capability. Tomographic reconstructions obtained from the denoised data are presented.

Introduction

In the context of tomographic imaging there are at least two situations in which wavelet analysis is worth attempting. One is when data compression is required, as in telemedical diagnosis, where radiological data are centrally archived and transmitted to various centres for examination; considering the enormous volume of imaging data that must be stored and transmitted, compression becomes essential. The second is when a signal buried in noisy data has to be recovered for a reasonable-quality reconstruction. This is of particular significance in optical tomography, where highly scattering tissue surrounding the object of interest makes the output data very noisy. Several experimental methods, employing tools such as time-gating [1,2], spatial filtering [3,4], heterodyne interferometry [5,6] and polarization [7], have been developed to recover the signal from noisy data; these rely on separating the ballistic or near-ballistic photons from the multiply scattered ones. Recovery of a signal from noisy data can also be done using transform-based signal processing methods, which have proved useful in recovering ECG data [8], astronomical spectra [9], and voice and picture signals transmitted through noisy channels [10]. Many of these noise reduction methods are based on wavelet expansion, which provides extra flexibility of time (or space) localization compared to Fourier expansion [11]. Donoho and Johnstone [12,13] demonstrated a wavelet-domain thresholding and shrinkage technique for noise reduction. The motivation for such processing is the ability of the wavelet expansion to concentrate the signal energy into a small number of large coefficients. Coefficients below a threshold are dropped, and those above the threshold are shrunk toward zero.
Hard thresholding alone yields a mean square error (MSE) close to that of an ideal estimate [14], while soft thresholding achieves smoother estimates with near-minimum MSE.
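The two thresholding rules discussed above can be sketched as follows (a minimal NumPy illustration, not code from the paper; the threshold value t is a free parameter here, with its data-driven choice deferred to Section 3):

```python
import numpy as np

def hard_threshold(c, t):
    """Keep coefficients whose magnitude exceeds t; zero the rest."""
    return np.where(np.abs(c) > t, c, 0.0)

def soft_threshold(c, t):
    """Zero coefficients with magnitude below t; shrink survivors toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

coeffs = np.array([0.2, -1.5, 3.0, -0.4, 2.2])
print(hard_threshold(coeffs, 1.0))  # [ 0.  -1.5  3.   0.   2.2]
print(soft_threshold(coeffs, 1.0))  # [ 0.  -0.5  2.   0.   1.2]
```

Hard thresholding leaves the surviving coefficients untouched, which is why it attains a small MSE but a less smooth estimate; the soft rule's extra shrinkage of survivors is what produces the smoother reconstructions mentioned above.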

Donoho and Johnstone's [12,13] and related methods gave excellent noise reduction when applied to NMR spectra [15], synthetic aperture radar signals [15] and astronomical data [9]. Their ability to compress signals came only from the number of small coefficients dropped below the hard threshold limit. Coifman and Wickerhauser [16] proposed an entropy-based method to select the best basis from the wavelet packet tree into which noisy data are expanded; it has been used to compress ECG data as well as a fingerprint data bank [8]. In this work, we make use of both Wickerhauser's algorithm and Donoho and Johnstone's method to compress and denoise optical tomography data. In Sections 2 (Optical tomography), 3 (Denoising by wavelet shrinkage) and 4 (Wavelet based compression and denoising using Wickerhauser's method), we briefly describe Donoho's method of denoising and Wickerhauser's method of entropy minimization, used generally for signal compression; for completeness, we also introduce optical tomography. In Section 5, we describe the experimental set-up for data collection from light transmitted across a fiber in a liquid medium containing scattering particles. The data are compressed and denoised using the methods of Sections 3 and 4, and the denoised and compressed data are compared with data uncorrupted by noise to assess the performance of the algorithms. The processed data are reconstructed using the standard filtered backprojection (FBP) algorithm. Section 6 gives our concluding remarks.

Section snippets

Optical tomography

In optical tomography, objects, which are refractive index distributions, are scanned by light whose time delay or intensity loss is measured. If the data are properly inverted, the time delay reconstructs the non-absorbing (real) part of the refractive index distribution, and the intensity loss the absorbing (imaginary) part. Even under the geometrical optics approximation, refraction cannot be neglected if the refractive index variations are not small. Incorporation of refraction correction methods is the

Denoising by wavelet shrinkage

This section is based on the work of Donoho and Johnstone [12,13]. We have a finite-length sequence of observations xi of a signal si corrupted by i.i.d., zero-mean, white Gaussian noise ni with standard deviation σ, i.e. xi = si + σni, where i = 1, 2, 3, …, N. We aim to recover si from xi. Let W and W⁻¹ [11,14] represent an N×N wavelet transform matrix and its inverse, respectively. In the wavelet domain, Eq. (1) becomes X = S + N, where X = Wx, S = Ws and N = Wn, and x, s and n are column vectors containing xi, si and ni,
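The shrinkage scheme built on this model can be sketched end to end. The following is our own minimal NumPy illustration, not code from the paper: a one-level orthonormal Haar transform stands in for W (the paper does not fix the wavelet in this snippet), σ is assumed known, and the detail coefficients are soft-thresholded with Donoho's universal threshold t = σ√(2 ln N):

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_forward(x):
    """One level of the orthonormal Haar transform (the W of the text)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail coefficients
    return a, d

def haar_inverse(a, d):
    """Inverse Haar transform (W^-1); exact for the forward step above."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise(x, sigma):
    """Soft-threshold detail coefficients with the universal threshold."""
    t = sigma * np.sqrt(2 * np.log(len(x)))
    a, d = haar_forward(x)
    d = np.sign(d) * np.maximum(np.abs(d) - t, 0.0)  # shrinkage in wavelet domain
    return haar_inverse(a, d)

N, sigma = 256, 0.1
s = np.sin(np.linspace(0, 2 * np.pi, N))        # clean signal s_i
x = s + sigma * rng.standard_normal(N)          # noisy observations x_i = s_i + sigma*n_i
x_hat = denoise(x, sigma)
print(np.mean((x - s) ** 2) > np.mean((x_hat - s) ** 2))  # shrinkage reduces MSE
```

Because W is orthonormal, the noise remains white Gaussian with the same σ in the transform domain, which is what justifies applying a single threshold to all detail coefficients.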

Wavelet based compression and denoising using Wickerhauser's method

Wavelet packets are a generalization of wavelets. In the fast wavelet transform algorithm, the sampled data are passed through the scaling and wavelet filters [18] and downsampled, resulting in approximation and detail coefficients. The approximation coefficients are then filtered with the same scaling and wavelet filters, generating another set of approximation and detail coefficients. This process is continued until the desired level of decomposition is reached. This is Mallat's pyramidal
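The packet idea can be sketched with Haar filters (our own simplified illustration, not the paper's implementation): unlike Mallat's pyramid, both the approximation and the detail bands are split again, and an entropy cost in the spirit of Coifman and Wickerhauser decides whether a parent node or its children enter the best basis. The per-node normalized entropy used here is a simplification of their additive cost, and the function names are ours:

```python
import numpy as np

def haar_split(x):
    """One filter-and-downsample step with Haar scaling/wavelet filters."""
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def entropy_cost(c):
    """Shannon-type entropy of normalized coefficient energies (simplified cost)."""
    e = c ** 2
    p = e / e.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def best_basis(node, depth):
    """Split both bands recursively; keep a split only if it lowers the cost."""
    if depth == 0 or len(node) < 2:
        return [node], entropy_cost(node)
    a, d = haar_split(node)
    kept_a, cost_a = best_basis(a, depth - 1)
    kept_d, cost_d = best_basis(d, depth - 1)
    if cost_a + cost_d < entropy_cost(node):
        return kept_a + kept_d, cost_a + cost_d
    return [node], entropy_cost(node)

x = np.sin(np.linspace(0, 8 * np.pi, 64))
leaves, cost = best_basis(x, depth=3)
print(sum(len(v) for v in leaves))  # 64: the chosen basis covers every coefficient
```

Whatever subtree the search selects, the leaf nodes partition the 64 coefficients exactly once; compression then comes from keeping only the highest-energy coefficients of this best basis, as described in the abstract.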

Experimental data collection and processing

The experimental set-up is shown in Fig. 3. The object, a 125 μm diameter (core diameter = 50 μm) graded-index single-mode fiber with a transmission window at 830 nm, was immersed in an index-matching liquid in a cuvette C. This was trans-illuminated by spatially incoherent, quasi-monochromatic light, and the output wavefront at the exit plane, E, of the cuvette was imaged by a microscope objective onto a CCD array connected to a computer.

Light, which traveled through the object, was

Conclusions

We have brought out the usefulness of two denoising algorithms for preprocessing noisy optical tomography data. Both algorithms fail when the scattering particle concentration is high, in which case an experimental method that captures the least-scattered photons has to be used to separate the signal. Of the two denoising algorithms, it is notable that Donoho and Johnstone's method gave reasonably good results at comparatively higher noise levels, whereas Wickerhauser's method also

Acknowledgements

The support of DAAD (German Academic Exchange Service) to the first author for carrying out part of the work at the University of Kaiserslautern is gratefully acknowledged. The referee's comments helped us correct a few mistakes in the original manuscript.

References (18)

  • J.C. Hebden et al.

    Appl. Opt.

    (1991)
  • M.R. Hee et al.

    Opt. Lett.

    (1993)
  • Q.Z. Wang et al.

Opt. Lett.

    (1995)
  • J.M. Schmitt et al.

    J. Opt. Soc. Am. A

    (1994)
  • K.P. Chau et al.

    Opt. Lett.

    (1995)
  • M. Kempe et al.

    J. Opt. Soc. Am. A

    (1996)
  • S.G. Demos et al.

    Opt. Lett.

    (1996)
  • B. Bradie

    IEEE Trans. Biomed. Eng.

    (1996)
  • M. Fligge et al.

    Astron. Astrophys. Suppl. Ser.

    (1997)
There are more references available in the full text version of this article.
