On Problem-Oriented Kernel Refining
Much attention has recently been devoted to the machine learning procedures known as kernel methods, Support Vector Machines being one instance of them. Their performance heavily depends on the particular 'distance measurement' between patterns, a function also known as the 'kernel', which represents a dot product in a projection space. Although some attempts have been made to decide 'a priori' which kernel function is more suitable for a given problem, no definite solution for this task has been found yet, since choosing the best kernel very often reduces to a selection among different possibilities by a cross-validation process. In this paper, we propose a method for solving classification problems that relies on the ad hoc determination of a kernel for every problem at hand, i.e., a problem-oriented kernel design method. We iteratively obtain a semiparametric function projecting the input data into a space whose dimension is low enough to avoid both overfitting and complexity explosion of the resulting machine, yet powerful enough to solve the classification problems with good accuracy. The performance of the proposed method is illustrated using standard databases, and we further discuss its suitability for developing problem-oriented feature extraction procedures.
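The statement that a kernel represents a dot product in a projection space can be illustrated with a minimal sketch (not the method proposed in this paper; the homogeneous degree-2 polynomial kernel and its explicit feature map below are standard textbook choices, used here only to show the correspondence):

```python
import math

def poly2_kernel(x, z):
    # Homogeneous degree-2 polynomial kernel: k(x, z) = (x . z)^2
    return sum(a * b for a, b in zip(x, z)) ** 2

def phi(x):
    # Explicit projection of a 2-D input whose ordinary dot product
    # reproduces poly2_kernel: phi(x) = (x1^2, x2^2, sqrt(2)*x1*x2)
    x1, x2 = x
    return (x1 * x1, x2 * x2, math.sqrt(2) * x1 * x2)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# The kernel evaluated on the inputs equals the dot product of the
# projected patterns, without ever forming phi explicitly at test time.
x, z = (1.0, 2.0), (3.0, -1.0)
assert abs(poly2_kernel(x, z) - dot(phi(x), phi(z))) < 1e-9
```

In general the projection space may be high- or even infinite-dimensional, which is why kernel methods evaluate k(x, z) directly instead of computing phi; the method proposed here instead seeks a low-dimensional projection tailored to each classification problem.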