Complex-Valued Neural Networks with Multi-Valued Neurons

 

2013 International Joint Conference on Neural Networks (IJCNN-2013)

Dallas, Texas, USA, August 4, 2013, 8:00am

 

IJCNN-2013 Tutorial by

 

Igor Aizenberg

 

Texas A&M University-Texarkana, USA

e-mail: igor.aizenberg at tamut.edu
URL: http://www.eagle.tamut.edu/faculty/igor

 

Scope

 

Complex-Valued Neural Networks (CVNNs) form a quickly growing area that attracts more and more researchers. A series of CVNN special sessions has been organized in recent years, for example, at ICONIP 2002 (Singapore), ICANN/ICONIP 2003 (Istanbul), ICONIP 2004 (Calcutta), WCCI-IJCNN 2006 (Vancouver), "Fuzzy Days 2006" (Dortmund), ICANN 2007 (Porto), WCCI-IJCNN 2008 (Hong Kong), IJCNN 2009 (Atlanta), WCCI-IJCNN 2010 (Barcelona), IJCNN 2011 (San Jose), and WCCI-IJCNN 2012 (Brisbane). These sessions have consistently drawn large and continuously growing audiences, with many interesting presentations and very productive discussions.

 

Due to the computational and theoretical advantages that processing in the complex domain offers over the real domain, complex-valued neural networks form one of the fastest-growing research areas in the neural network community. In addition, recent progress in pattern recognition, robotics, mathematical biosciences, and brain-computer interface design has brought to light problems in which nonlinearity, multidimensional data, uncertainty, and complexity play major roles; complex-valued neural networks are a natural model for such applications.

The most important notion underlying the theory of complex-valued neural networks is that of phase information. It enables us to employ advanced concepts, such as phase synchrony and coherence, and to model amplitude and phase relationships simultaneously across a range of computational scenarios.

 

The Multi-Valued Neuron (MVN) is a complex-valued neuron whose inputs and output are located on the unit circle. The MVN has a circular activation function, which depends only on the phase of the weighted sum and projects it onto the unit circle. These specific properties determine many unique advantages of the MVN. The most important of them are the ability to learn non-linearly separable input/output mappings without any network and the simplicity of derivative-free learning based on the error-correction rule. For example, such classical non-linearly separable problems as XOR and Parity n can easily be learned by a single MVN with a periodic activation function, without any network.
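
To make the activation function concrete, here is a minimal Python sketch of the discrete and continuous MVN activation functions (the function names and the sector convention are our illustration, not taken verbatim from the tutorial):

    import numpy as np

    def mvn_continuous(z):
        # Continuous MVN activation: project the weighted sum z onto the
        # unit circle, P(z) = z / |z| = exp(i * arg(z)).
        # The output depends only on the phase of z, not on its magnitude.
        return z / np.abs(z)

    def mvn_discrete(z, k):
        # Discrete MVN activation: the complex plane is divided into k equal
        # sectors; if arg(z) falls into the j-th sector, the output is the
        # k-th root of unity exp(i * 2*pi*j / k).
        j = np.floor(k * (np.angle(z) % (2 * np.pi)) / (2 * np.pi))
        return np.exp(2j * np.pi * j / k)

For example, with k = 4, a weighted sum whose argument is 100 degrees falls into the second sector (j = 1), so the neuron outputs exp(i*pi/2) = i.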

 

MVN-based complex-valued neural networks also have a number of unique advantages.

For example, the Multilayer Feedforward Neural Network with Multi-Valued Neurons (MLMVN) significantly outperforms a classical multilayer perceptron (MLP) and many kernel-based techniques in terms of generalization capability and the number of parameters employed. MLMVN learns significantly faster than MLP, and its learning algorithm, including the error backpropagation rule, is derivative-free. The MLMVN learning algorithm is based on the same error-correction learning rule as the MVN learning algorithm. MVN-based Hopfield neural networks have shown unique capabilities as associative memories.
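
At the core of both MVN and MLMVN learning is the error-correction rule. In its single-neuron form the update is W <- W + (1/(n+1)) * delta * conj(X), where delta = D - Y is the difference between the desired and the actual output and X is the input vector extended with the constant input x0 = 1; no derivatives are involved. A minimal single-MVN sketch under these assumptions (the unit learning rate and the helper names are ours):

    import numpy as np

    def error_correction_step(w, x, desired, activation):
        # x holds the n inputs on the unit circle; prepend the constant
        # input x0 = 1 so that w[0] acts as the free (bias) weight.
        x = np.concatenate(([1.0 + 0.0j], x))
        y = activation(np.dot(w, x))     # actual output of the neuron
        delta = desired - y              # complex error
        # Derivative-free update: distribute the error over the n + 1
        # weights along the complex-conjugated input directions.
        return w + delta * np.conj(x) / len(x)

Iterating this step over the training samples until every weighted sum lands in its desired sector gives the basic form of the MVN learning algorithm; MLMVN backpropagates the errors through the layers and then applies the same kind of update to every neuron.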

Complex-valued neural network models have been shown not only to exhibit enhanced accuracy, but also to facilitate physical interpretation of their variables. One of the recent applications of MLMVN is decoding signals in EEG-based brain-computer interfaces.

The circularity of the MVN activation function is natural and very suitable in applications where the processed and analyzed signals are represented in the frequency domain. It can be especially interesting to use the MVN in studies devoted to the modeling and simulation of biological neurons.

The synergy of complex nonlinearity, circularity, and the ability to linearly separate mappings that are not separable in the real domain underpins this tutorial, which aims at providing a rigorous unifying framework for the design, analysis, and interpretation of complex neural network models. The material is supported by detailed case studies across learning, pattern recognition, image processing, mathematical biosciences, and computational neuroscience, highlighting the practical usefulness of MVN-based complex-valued neural networks.

This tutorial represents a significant step forward from the presenter's earlier tutorials given together with Prof. Danilo Mandic and Prof. Akira Hirose at IJCNN-2010 and IJCNN-2011. It provides a complete and rigorous overview of the state of the art in the area, supported by practical applications and working solutions.

The presenter is the author of a recent fundamental research monograph in the area (published by Springer in 2011) focusing on neural networks with multi-valued neurons and their applications in pattern recognition and classification. The presentation will be based on this monograph and the recent journal publications that followed it. The material will be extensively illustrated, and a number of practical applications in pattern recognition, classification, intelligent image processing, and time series prediction will be considered in detail.

 

Contents of the Tutorial

 

  1. Brief introduction. Complex-valued neural networks: why do we need them?
  2. Multiple-valued (k-valued) logic over the field of complex numbers. k-separability of n-dimensional space.
    A multi-valued neuron (MVN) and its functionality. Discrete and continuous MVN.
  3. Learning rules for MVN. The Hebbian rule. The "closeness" rule. The error-correction rule. MVN learning algorithm and its convergence.
    Choice of the best starting weights for the learning process.
  4. MVN with a periodic activation function (MVN-P) and solving non-linearly separable problems using a single MVN-P
    (XOR, parity n, mod k addition of n inputs, various benchmark problems); see the XOR sketch after this list.
  5. A multilayer feedforward neural network based on multi-valued neurons (MLMVN). The error backpropagation and its specific organization for the MLMVN.
    The error-correction learning rule for MLMVN.
  6. A derivative-free MLMVN learning algorithm based on the error-correction learning rule and its convergence. Hard-margin and soft-margin learning for MLMVN.
  7. Solving popular benchmark classification and prediction problems and comparison with competing solutions (a standard backpropagation network, kernel-based networks, SVM, neuro-fuzzy networks).
  8. Application of MLMVN to solving real-world problems: identification of blur and its parameters for image deblurring; recognition of blurred images; intelligent edge detection; detection of impulse noise; time-series prediction; classification of microarray gene expression data. The frequency domain as a natural source of features for classification.
  9. MLMVN as a signal decoder in an EEG-based brain-computer interface. Similarity of MVN and biological neurons.
  10. MVN-based associative memories and their applications.
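
As an illustration of item 4 above, the following sketch shows a single MVN with a periodic activation function solving XOR (the code and the hand-picked weight vector are our own illustration; many other weight vectors work as well). With the usual encoding of Boolean 0 as +1 and Boolean 1 as -1, and k = 2, l = 2, the unit circle is split into k*l = 4 sectors whose outputs alternate between +1 and -1:

    import numpy as np

    def mvn_p(z, sectors=4):
        # Periodic MVN activation: sector j of the unit circle -> (-1)**j,
        # so the outputs +1 and -1 alternate around the circle.
        j = int(np.angle(z) % (2 * np.pi) // (2 * np.pi / sectors))
        return (-1) ** j

    w = np.array([0.0, 1.0, 1.0j])   # (w0, w1, w2): bias weight and two input weights

    for x1 in (1, -1):               # +1 encodes Boolean 0, -1 encodes Boolean 1
        for x2 in (1, -1):
            z = w[0] + w[1] * x1 + w[2] * x2
            print((x1, x2), '->', mvn_p(z))
    # Prints (1, 1) -> 1, (1, -1) -> -1, (-1, 1) -> -1, (-1, -1) -> 1,
    # which is exactly XOR under the chosen encoding.

No single real-valued neuron can do this, since XOR is not linearly separable in the real domain.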

 

 

 
