Improving the Multilayer Perceptron Learning by using a Method to Calculate the Initial Weights with the Quality of Similarity Measure based on Fuzzy Sets and Particle Swarms

Lenniet Coello, Yumilka Fernandez, Yaima Filiberto, Rafael Bello


The most widely used neural network model is the Multilayer Perceptron (MLP), whose connection weights are normally trained with the Back Propagation (BP) learning algorithm. Good initial weight values lead to faster convergence and better generalization, even with simple gradient-based error minimization techniques. This work presents a method for calculating the initial weights used to train the Multilayer Perceptron. The method, named PSO+RST+FUZZY, is based on the quality of similarity measure proposed in the framework of the extended Rough Set Theory, and employs fuzzy sets to characterize the domain of the similarity thresholds. The sensitivity of BP to initial weights computed with PSO+RST+FUZZY was studied experimentally, and the method shows better performance than other approaches for calculating feature weights.
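The abstract does not include code, but the general idea it describes can be illustrated with a rough, hypothetical sketch: per-feature weights (here a stand-in array `fw`, not the actual PSO+RST+FUZZY measure, which requires the similarity-quality machinery from the paper) scale the input-to-hidden weights of a small MLP before standard backpropagation. The network, data, and hyperparameters below are illustrative assumptions only.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_weights(n_in, n_hidden, n_out, feature_weights, seed=0):
    """Initialize MLP weights, scaling each input unit's outgoing
    weights by a per-feature importance weight (a stand-in for the
    PSO+RST+FUZZY quality-of-similarity measure)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.1, (n_in, n_hidden)) * feature_weights[:, None]
    W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
    return W1, W2

def train_bp(X, y, W1, W2, lr=0.5, epochs=200):
    """Plain batch Back Propagation on a one-hidden-layer sigmoid MLP,
    minimizing mean squared error. Returns the per-epoch loss curve."""
    losses = []
    for _ in range(epochs):
        H = sigmoid(X @ W1)          # hidden activations
        out = sigmoid(H @ W2)        # network output
        err = out - y
        losses.append(float(np.mean(err ** 2)))
        delta_out = err * out * (1.0 - out)           # output-layer delta
        delta_hid = (delta_out @ W2.T) * H * (1.0 - H)  # backpropagated delta
        W2 -= lr * (H.T @ delta_out)
        W1 -= lr * (X.T @ delta_hid)
    return losses

# Toy example (illustrative): learn logical OR with assumed feature weights.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [1]], dtype=float)
fw = np.array([1.0, 0.8])  # hypothetical per-feature quality weights
W1, W2 = init_weights(2, 4, 1, fw)
losses = train_bp(X, y, W1, W2)
```

In this sketch, a larger feature weight gives that input a proportionally stronger initial influence on the hidden layer, which is one plausible way a feature-quality measure could bias the starting point of BP training.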


Multilayer perceptron, weight initialization, quality of similarity measure, fuzzy sets.
