Creating Modular-like Ensembles by Output Clustering
In this paper we consider the possibility of replacing the output layer of Multi-Layer Perceptrons (MLPs) with local schemes when dealing with classification problems. In order to open the possibility of developing LMS-trainable (Least-Mean-Squares) models, and subsequent adaptive schemes, we apply a trainable version of the classical k-Nearest Neighbour (kNN) classifier, named the kNN-Learning Vector Classifier. We derive the corresponding training formulas for the whole resulting structure and apply it to several classification benchmark problems. The experimental results give evidence of a nearly systematic advantage of our proposal over MLPs, as well as of its competitive performance with respect to Modular Neural Networks (MNNs), whose philosophy is similar to that of our approach.
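The core idea of an LMS-trainable nearest-prototype output layer can be sketched as follows. This is not the paper's exact kNN-Learning Vector Classifier formulation; it is a minimal illustrative surrogate, assuming a soft (softmax-over-negative-squared-distances) prototype classifier whose prototype positions are updated by gradient descent on the squared output error. All names, the initialisation, and the toy data are hypothetical.

```python
import numpy as np

def soft_knn_forward(X, protos, proto_labels, n_classes, beta=1.0):
    """Differentiable nearest-prototype ("trainable kNN") classifier:
    softmax weights over negative squared distances to the prototypes,
    aggregated per class to yield class probabilities."""
    d2 = ((X[:, None, :] - protos[None, :, :]) ** 2).sum(-1)   # (n, p)
    s = -beta * d2
    w = np.exp(s - s.max(1, keepdims=True))                    # stable softmax
    w /= w.sum(1, keepdims=True)
    onehot = np.eye(n_classes)[proto_labels]                   # (p, c)
    return w @ onehot, w                                       # probs, weights

def lms_train(X, y, protos, proto_labels, n_classes,
              beta=1.0, lr=0.05, epochs=200):
    """LMS training: gradient descent on the squared error between the
    class probabilities and one-hot targets, moving prototype positions."""
    T = np.eye(n_classes)[y]
    onehot = np.eye(n_classes)[proto_labels]
    protos = protos.copy()
    for _ in range(epochs):
        P, w = soft_knn_forward(X, protos, proto_labels, n_classes, beta)
        E = P - T                                    # (n, c) output error
        A = E @ onehot.T                             # error routed to prototypes
        G = w * (A - (w * A).sum(1, keepdims=True))  # d(loss)/d(logits)
        grad = 2 * beta * (G.T @ X - G.sum(0)[:, None] * protos) / len(X)
        protos -= lr * grad
    return protos

# Toy usage: two Gaussian blobs, two prototypes per class (hypothetical setup).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.repeat([0, 1], 50)
protos0 = rng.normal(0.0, 0.5, (4, 2))
plabels = np.array([0, 0, 1, 1])
protos = lms_train(X, y, protos0, plabels, 2)
acc = (soft_knn_forward(X, protos, plabels, 2)[0].argmax(1) == y).mean()
```

Because the softmax over distances is differentiable everywhere, the same chain rule extends through an MLP feeding this layer, which is what makes the hybrid structure trainable end to end.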