Adaptive wavelet classification of acoustic backscatter and imagery

Brian A. Telfer, Harold H. Szu, Gerald J. Dobeck, Joseph P. Garcia, Hanseok Ko, Abinash C. Dubey, Ned H. Witherspoon

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

The utility and robustness of wavelet features are demonstrated through three practical case studies of detecting objects in multispectral electro-optical imagery, sidescan sonar imagery, and acoustic backscatter. Attention is given to choosing proper waveforms for particular applications. Using artificial neural networks (ANNs), evidence is fused from multiple waveform types that detect local features. The wavelet waveforms and their dilation and shift parameters are adaptively computed with ANNs to maximize classification accuracy. Emphasis is placed on the acoustic backscatter case study, which involves detecting a metallic man-made object amid natural and synthetic specular clutter with reverberation noise. The synthetic clutter is shown to be a good model for the natural clutter and to better delineate the classification boundary. The classifier computes the locations, sizes, and weights of Gaussian patches in time-scale space that contain the most discriminatory information. This new approach is shown to give higher classification rates than an ANN with commonly used power spectral features. The new approach also reduces the number of free parameters relative to a classifier based on all wavelet features, which leads to simpler implementation for applications and to potentially better generalization to test data.
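
The sketch below gives a concrete picture of the Gaussian-patch time-scale features described in the abstract. It is a minimal illustration, not the authors' implementation: the Morlet mother wavelet, the patch parameters, and all function names (morlet_cwt, gaussian_patch_features) are assumptions for the example, and a single logistic unit stands in for the ANN output. In the paper the patch locations, sizes, and weights are adapted jointly with the network weights; here they are held fixed for clarity.

```python
# Illustrative sketch only: Morlet time-scale map pooled by Gaussian patches,
# then scored by one logistic unit standing in for the ANN classifier.
import numpy as np

def morlet_cwt(signal, scales, w0=6.0):
    """Magnitude of a continuous wavelet transform with a Morlet wavelet."""
    n = len(signal)
    coeffs = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        wavelet /= np.sqrt(s)
        coeffs[i] = np.abs(np.convolve(signal, wavelet, mode="same"))
    return coeffs  # shape: (n_scales, n_samples)

def gaussian_patch_features(tf_map, patches):
    """Pool a time-scale map with Gaussian patches (centres, widths, weight)."""
    n_scales, n_times = tf_map.shape
    ss, tt = np.meshgrid(np.arange(n_scales), np.arange(n_times), indexing="ij")
    feats = []
    for (tc, sc, sig_t, sig_s, weight) in patches:
        window = np.exp(-0.5 * (((tt - tc) / sig_t) ** 2 +
                                ((ss - sc) / sig_s) ** 2))
        feats.append(weight * np.sum(window * tf_map))
    return np.array(feats)

# Toy usage: a decaying tone in reverberation-like noise as a stand-in echo.
rng = np.random.default_rng(0)
echo = np.sin(2 * np.pi * 0.05 * np.arange(512)) * np.exp(-np.arange(512) / 200)
echo += 0.3 * rng.standard_normal(512)

scales = np.arange(2, 32, 2)
tf_map = morlet_cwt(echo, scales)

# Hypothetical patches: (time centre, scale index, time width, scale width, weight).
patches = [(100, 3, 30.0, 2.0, 1.0),
           (250, 8, 50.0, 3.0, 0.5)]
x = gaussian_patch_features(tf_map, patches)

# One logistic unit as a placeholder for the ANN output layer.
w, b = rng.standard_normal(len(x)), 0.0
p_object = 1.0 / (1.0 + np.exp(-(w @ x + b)))
print("feature vector:", x, "-> P(man-made object) =", p_object)
```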

Original language: English
Pages (from-to): 2192-2203
Number of pages: 12
Journal: Optical Engineering
Volume: 33
Issue number: 7
DOIs
Publication status: Published - 1994
Externally published: Yes

ASJC Scopus subject areas

  • Atomic and Molecular Physics, and Optics
  • Engineering (all)
