
A novel hybrid global optimization (GO) algorithm for supervised learning of feedforward neural networks (NNs) is investigated. The network weights are determined by minimizing the traditional mean square error function. The optimization technique, called LP(tau)NM, combines a novel global heuristic search based on LPtau low-discrepancy sequences of points with a simplex local search. The proposed method is first tested on multimodal mathematical functions and then applied to training moderate-size NNs on popular benchmark problems. Finally, the results are analyzed, discussed, and compared with those obtained by backpropagation (BP) (Levenberg-Marquardt) and differential evolution methods.
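The two-phase idea described in the abstract can be sketched as follows. This is a hedged illustration, not the paper's implementation: the paper uses LPtau (Sobol-type) low-discrepancy sequences, for which a Halton sequence stands in here as a simpler low-discrepancy generator, and the Rastrigin function is a standard multimodal benchmark chosen for illustration, not necessarily one from the paper. Phase 1 scatters candidate points quasi-uniformly over the search box and keeps the best; phase 2 refines it with a Nelder-Mead simplex local search.

```python
import math

def halton(index, base):
    """k-th element of the van der Corput sequence in the given base
    (coordinates of a Halton point use distinct prime bases)."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def rastrigin(x):
    """Multimodal benchmark: many local minima, global minimum 0 at the origin."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def nelder_mead(f, x0, step=0.5, iters=300):
    """Minimal Nelder-Mead simplex search with the standard
    reflection/expansion/contraction/shrink coefficients."""
    n = len(x0)
    simplex = [list(x0)]
    for i in range(n):                       # initial simplex around x0
        p = list(x0)
        p[i] += step
        simplex.append(p)
    alpha, gamma, rho, sigma = 1.0, 2.0, 0.5, 0.5
    for _ in range(iters):
        simplex.sort(key=f)
        centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        worst = simplex[-1]
        xr = [centroid[i] + alpha * (centroid[i] - worst[i]) for i in range(n)]
        if f(simplex[0]) <= f(xr) < f(simplex[-2]):
            simplex[-1] = xr                                  # reflection
        elif f(xr) < f(simplex[0]):
            xe = [centroid[i] + gamma * (xr[i] - centroid[i]) for i in range(n)]
            simplex[-1] = xe if f(xe) < f(xr) else xr         # expansion
        else:
            xc = [centroid[i] + rho * (worst[i] - centroid[i]) for i in range(n)]
            if f(xc) < f(worst):
                simplex[-1] = xc                              # contraction
            else:                                             # shrink toward best
                best = simplex[0]
                simplex = [best] + [
                    [best[i] + sigma * (p[i] - best[i]) for i in range(n)]
                    for p in simplex[1:]
                ]
    return min(simplex, key=f)

# Phase 1: evaluate the objective on a low-discrepancy point set over the box.
lo, hi = -5.12, 5.12
points = [[lo + (hi - lo) * halton(k, b) for b in (2, 3)] for k in range(1, 257)]
seed = min(points, key=rastrigin)

# Phase 2: refine the best sampled point with the simplex local search.
refined = nelder_mead(rastrigin, seed)
print(rastrigin(seed), rastrigin(refined))
```

For NN training as in the paper, the objective would instead be the mean square error of the network as a function of its weight vector, with the box covering a plausible weight range; the two-phase structure is unchanged.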

Original publication

Journal article: IEEE Transactions on Neural Networks

Publication Date:

Pages: 937 - 942


Keywords: Models, Statistical, Algorithms, Decision Support Techniques, Neural Networks (Computer), Artificial Intelligence, Computer Simulation, Information Storage and Retrieval, Pattern Recognition, Automated