Computational Intelligence Laboratory (CIL),
Institute of Informatics and Telecommunications
National Centre for Scientific Research “Demokritos”
Contact person: Stavros Perantonis

Supervised Learning, Neural Networks

Novel supervised learning algorithms for feedforward networks have been proposed that incorporate additional knowledge into the learning rule by means of constrained optimization [Per00a]. This approach has been shown to lead to efficient Constrained Learning Algorithms with accelerated learning properties [Kar95, Per95, Amp02, Amp01, Per00b, Amp99]. The additional knowledge can be encoded in the form of objectives, leading to single- or multi-objective optimization criteria that have to be satisfied simultaneously with the demand for a long-term decrease of the cost function.
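
As an illustration only (the notation E, Φ, δQ, δP is assumed here rather than taken from the cited papers), a constrained step of this kind can be posed as choosing, at each iteration, the weight update dw that maximises the change of the additional objective while enforcing a prescribed decrease of the cost and a bounded step length:

\[
\max_{\,d\mathbf{w}} \; (\nabla_{\mathbf{w}}\Phi)^{\top} d\mathbf{w}
\quad \text{subject to} \quad
(\nabla_{\mathbf{w}} E)^{\top} d\mathbf{w} = -\,\delta Q,
\qquad \lVert d\mathbf{w} \rVert^{2} = \delta P^{2},
\]

where E is the cost function, Φ the additional objective, and δQ, δP prescribed positive constants; the solution is a linear combination of the two gradients obtained through Lagrange multipliers.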

In this approach, an optimization problem is formulated at each epoch of the learning process. The additional information incorporated into the algorithm can be either of a general nature (exploiting certain features of the cost function landscape) or problem-specific (exploiting characteristics of the particular application solved by training the feedforward network). The family of proposed algorithms comprises both first-order algorithms, which use only gradient information [Kar95, Per95], and second-order algorithms, which also take Hessian information into account [Amp02].
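
The NumPy sketch below solves the first-order version of the constrained step posed above in closed form. It is a minimal illustration under the stated assumptions (linearised E and Φ, a single additional objective), not a reproduction of the algorithms in [Kar95, Per95].

import numpy as np

def constrained_first_order_step(grad_E, grad_Phi, dQ, dP):
    """Weight update dw maximising grad_Phi . dw subject to
    grad_E . dw = -dQ (prescribed cost decrease) and ||dw||^2 = dP^2."""
    Igg = grad_E @ grad_E
    Igf = grad_E @ grad_Phi
    Iff = grad_Phi @ grad_Phi
    num = dP**2 * Igg - dQ**2          # infeasible if the step is too short
    if num < 0:
        raise ValueError("dQ is too large for the requested step length dP")
    denom = Iff * Igg - Igf**2         # >= 0 by Cauchy-Schwarz
    # dw = a*grad_E + b*grad_Phi; the positive root maximises grad_Phi . dw
    # (the degenerate case of parallel gradients is handled crudely with b = 0)
    b = np.sqrt(num / denom) if denom > 1e-12 else 0.0
    a = (-dQ - b * Igf) / Igg
    return a * grad_E + b * grad_Phi

A second-order variant would replace the linearised changes of E and Φ with quadratic models built from Hessian (or Hessian-vector-product) information, at the cost of a more involved per-epoch subproblem.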

These algorithms have been widely cited in the literature and used by other authors in many different applications, such as eye and face detection, gender recognition, dynamics identification and video quality estimation [Amp98]. Another successful application of the constrained learning algorithms, which has gained momentum in recent years, is numerical polynomial factorization and root finding.

Feature Extraction

In [Pet04], a general framework is proposed for feature generation in pattern recognition problems that takes class information into account. This framework unifies state-of-the-art feature extraction methods, such as linear discriminant analysis, heteroscedastic discriminant analysis and maximization of mutual information, as well as neural-network-based methods [Per99], under a common information-theoretic formulation, and serves as a springboard for the development of new, efficient supervised feature generation algorithms.
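
As a concrete member of this family of class-aware feature extractors, the sketch below computes classical Fisher linear discriminant features. It is intended only as a minimal NumPy/SciPy illustration of supervised feature generation, not as an implementation of the framework of [Pet04].

import numpy as np
from scipy.linalg import eigh

def fisher_lda_features(X, y, n_components):
    """Project X onto directions maximising between-class over within-class scatter."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))            # within-class scatter
    Sb = np.zeros((d, d))            # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - overall_mean)[:, None]
        Sb += Xc.shape[0] * (diff @ diff.T)
    # Generalised eigenproblem Sb w = lambda Sw w (small ridge keeps Sw positive definite)
    evals, evecs = eigh(Sb, Sw + 1e-8 * np.eye(d))
    W = evecs[:, np.argsort(evals)[::-1][:n_components]]
    return X @ W

The discriminant directions are the leading generalized eigenvectors of the between-class versus within-class scatter pair; for C classes, at most C-1 of them carry class-discriminative information.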