Neural Networks Based Time-Delay Estimation using DCT Coefficients
This study addressed the problem of estimating a constant time delay embedded in a received signal that was a noisy, delayed, and damped image of a known reference signal. The received signal was filtered, normalized with respect to its peak value, and then transformed by the Discrete Cosine Transform (DCT) into DCT coefficients. The DCT coefficients most sensitive to time-delay variations were selected and grouped to form the Reduced Discrete Cosine Transform Coefficients (RDCTC) set. The time delays embedded in the filtered signals were efficiently encoded in these RDCTC sets, which were applied to a pre-trained multi-layer feedforward Neural Network (NN) that computed the time-delay estimates. The network was initially trained with large sets of RDCTC vectors, each vector corresponding to a signal delayed by a randomly selected constant time delay. Using the RDCTC as the NN input, instead of the full-length incoming signal itself, resulted in a major reduction in NN size. Accurate time-delay estimates were obtained through simulation and compared against estimates obtained with the classical cross-correlation technique.
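The feature-extraction pipeline the abstract describes (peak normalization, DCT, selection of the delay-sensitive coefficients) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the reference waveform (a Gaussian pulse), the damping and noise levels, the number of retained coefficients, and the variance-based selection rule are all assumptions, and the pre-filtering stage is omitted.

```python
import numpy as np

def dct_ii(x):
    """Orthonormal DCT-II of a 1-D signal (illustrative helper)."""
    N = len(x)
    n = np.arange(N)
    k = n.reshape(-1, 1)
    C = np.cos(np.pi * (2 * n + 1) * k / (2 * N))
    C[0] *= 1 / np.sqrt(2)
    return np.sqrt(2 / N) * (C @ x)

N = 128
t = np.arange(N)

def received(delay, damping=0.8, noise_std=0.05):
    """Noisy, delayed, damped image of an assumed Gaussian reference pulse."""
    rng = np.random.default_rng(0)
    s = damping * np.exp(-0.5 * ((t - 30 - delay) / 4.0) ** 2)
    return s + noise_std * rng.standard_normal(N)

def rdctc(signal, idx):
    """Peak-normalize, transform, and keep only the selected coefficients."""
    x = signal / np.max(np.abs(signal))
    return dct_ii(x)[idx]

# Select the coefficients whose variance across candidate delays is largest,
# i.e. those most sensitive to time-delay variations (assumed criterion).
delays = np.linspace(0, 20, 50)
coeffs = np.array([dct_ii(received(d) / np.max(np.abs(received(d))))
                   for d in delays])
idx = np.argsort(coeffs.var(axis=0))[::-1][:16]  # keep 16 coefficients

# The resulting low-dimensional vector is what would feed the NN,
# in place of the full 128-sample signal.
features = rdctc(received(delay=7.5), idx)
```

The dimensionality reduction is the point: the network sees 16 inputs rather than 128, which is what the abstract credits for the major reduction in NN size.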
Copyright: © 2009 Samir J. Shaltaf and Ahmad A. Mohammad. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
- Neural networks
- time-delay estimation
- discrete cosine transform