Combined Kohonen neural networks and discrete cosine transform method for iterated transformation theory
Authors:
Abstract
Iterated transformation theory (ITT) coding, also known as fractal coding, allows fast decoding in its original form but suffers from long encoding times. During the encoding step, a large number of block best-matching searches must be performed, which makes the process computationally expensive. For this reason, most research efforts in this field have focused on speeding up the encoding algorithm, and many different methods have been proposed, ranging from simple classification schemes to multi-dimensional nearest key search. In this paper we present a new method that significantly reduces the computational load of ITT-based image coding. Both the domain and range blocks of the image are transformed into the frequency domain, which has proven to be more appropriate for ITT coding. The transformed domain blocks are then used to train a two-dimensional Kohonen neural network (KNN), forming a codebook similar to that of vector quantization coding. Because the KNN (like self-organizing feature maps in general) preserves the topology of its input space (the transformed domain blocks), a neighboring search over the map suffices to find the piecewise transformation between domain and range blocks.
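The pipeline the abstract describes, DCT-transforming domain blocks, training a 2-D Kohonen map on them, and then matching each range block by a best-matching-unit lookup plus a small neighborhood search, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 8×8 block size, the 4×4 SOM grid, the training schedule, and all function names are assumptions for the sketch, and the hand-rolled orthonormal DCT-II stands in for whatever transform variant the paper uses.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis matrix: C @ x is the 1-D DCT of x.
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C *= np.sqrt(2.0 / n)
    C[0] /= np.sqrt(2.0)
    return C

def dct2(block, C):
    # Separable 2-D DCT of a square block.
    return C @ block @ C.T

def train_som(data, grid=(4, 4), iters=500, lr0=0.5, sigma0=1.5, seed=0):
    # Minimal 2-D Kohonen self-organizing map; each grid unit holds
    # one codebook vector in the DCT-coefficient space.
    rng = np.random.default_rng(seed)
    gh, gw = grid
    w = rng.standard_normal((gh, gw, data.shape[1])) * 0.1
    coords = np.stack(
        np.meshgrid(np.arange(gh), np.arange(gw), indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(w - x, axis=-1)
        bmu = np.unravel_index(np.argmin(d), (gh, gw))
        # Decaying learning rate and neighborhood width.
        lr = lr0 * (1 - t / iters)
        sigma = sigma0 * (1 - t / iters) + 0.3
        h = np.exp(-np.sum((coords - bmu) ** 2, axis=-1) / (2 * sigma ** 2))
        w += lr * h[..., None] * (x - w)
    return w

B = 8
rng = np.random.default_rng(1)
img = rng.random((64, 64))          # stand-in image
C = dct_matrix(B)

# Domain blocks -> DCT coefficient vectors (the SOM training set).
blocks = np.array([dct2(img[i:i + B, j:j + B], C).ravel()
                   for i in range(0, 64, B) for j in range(0, 64, B)])
som = train_som(blocks)

# Encoding one range block: find its best-matching unit on the map;
# topology preservation means good domain candidates sit in that
# unit's grid neighborhood, so only those need an exhaustive check.
range_vec = dct2(img[8:16, 8:16], C).ravel()
d = np.linalg.norm(som - range_vec, axis=-1)
bmu = np.unravel_index(np.argmin(d), d.shape)
```

The key saving is in the last step: instead of comparing each range block against every domain block, the search is restricted to the codebook units around `bmu`, which the SOM's topology preservation makes a reasonable shortlist.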
Keywords: Digital image compression, Iterated function systems, Fractal image compression, Kohonen neural nets, Discrete cosine transform, Vector quantization
Article history: Received 3 August 1999, Revised 1 February 2000, Accepted 15 May 2000, Available online 26 March 2001.
DOI: https://doi.org/10.1016/S0923-5965(00)00041-2