Approximation by fully complex multilayer perceptrons
- PMID: 12816570
- DOI: 10.1162/089976603321891846
Abstract
We investigate the approximation ability of a multilayer perceptron (MLP) network when it is extended to the complex domain. The main challenge in processing complex data with neural networks has been the lack of bounded and analytic complex nonlinear activation functions in the complex domain, a consequence of Liouville's theorem. To avoid the conflict between the boundedness and the analyticity of a nonlinear complex function in the complex domain, a number of ad hoc MLPs have traditionally been employed, including the use of two real-valued MLPs, one processing the real part and the other the imaginary part. However, since nonanalytic functions do not satisfy the Cauchy-Riemann conditions, they lead to degenerate backpropagation algorithms that compromise the efficiency of nonlinear approximation and learning in the complex vector field. A number of analytic elementary transcendental functions (ETFs) derivable from the entire exponential function e^z are defined as fully complex activation functions and are shown to provide a parsimonious structure for processing data in the complex domain and to address most of the shortcomings of the traditional approach. The introduction of ETFs, however, raises a new question about the approximation capability of this fully complex MLP. In this letter, three proofs of the approximation capability of the fully complex MLP are provided, based on the singularity characteristics of the ETFs. First, fully complex MLPs with continuous ETFs over a compact set in the complex vector field are shown to be universal approximators of any continuous complex mapping. The complex universal approximation theorem extends to bounded measurable ETFs possessing a removable singularity. Finally, it is shown that the output of complex MLPs using ETFs with isolated and essential singularities uniformly converges to any nonlinear mapping in the deleted annulus of singularity nearest to the origin.
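The sketch below is an illustrative example, not the authors' code: a single-hidden-layer fully complex MLP forward pass in Python/NumPy, using the complex hyperbolic tangent, one of the ETFs derived from e^z, as the activation. All class names, shapes, and the initialization scheme are assumptions made for the example.

```python
import numpy as np

# Illustrative sketch (assumption, not the paper's implementation): a fully complex
# single-hidden-layer MLP. Weights, biases, and signals are complex-valued, and the
# hidden nonlinearity is the complex tanh, an elementary transcendental function
# (ETF) derived from e^z. tanh(z) is analytic but has isolated poles at
# z = i*pi/2 + i*k*pi, so inputs should stay in a bounded region away from them.

rng = np.random.default_rng(0)

def complex_init(shape, rng):
    """Small complex-valued initialization: independent real and imaginary parts."""
    scale = 1.0 / np.sqrt(shape[-1])
    return scale * (rng.standard_normal(shape) + 1j * rng.standard_normal(shape))

class FullyComplexMLP:
    """One hidden layer: y = W2 @ tanh(W1 @ z + b1) + b2, with everything in C."""

    def __init__(self, n_in, n_hidden, n_out, rng):
        self.W1 = complex_init((n_hidden, n_in), rng)
        self.b1 = np.zeros(n_hidden, dtype=complex)
        self.W2 = complex_init((n_out, n_hidden), rng)
        self.b2 = np.zeros(n_out, dtype=complex)

    def forward(self, z):
        # np.tanh on a complex array evaluates the complex tanh (sinh(z)/cosh(z)),
        # so one complex weight matrix couples real and imaginary parts instead of
        # two separate real-valued MLPs acting on Re(z) and Im(z).
        h = np.tanh(self.W1 @ z + self.b1)
        return self.W2 @ h + self.b2

# Example: forward pass against the complex target mapping f(z) = z**2.
net = FullyComplexMLP(n_in=1, n_hidden=8, n_out=1, rng=rng)
z = np.array([0.3 + 0.4j])
print(net.forward(z))  # complex network output
print(z**2)            # target value for comparison
```

By contrast, the split-complex (ad hoc) approach described in the abstract would run two real-valued networks on Re(z) and Im(z) separately; the fully complex version above uses a single set of complex weights, which is the parsimonious structure the letter's approximation results concern.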
Similar articles
- Dynamics of learning near singularities in layered networks. Neural Comput. 2008 Mar;20(3):813-43. doi: 10.1162/neco.2007.12-06-414. PMID: 18045020
- Generalized neuron: feedforward and recurrent architectures. Neural Netw. 2009 Sep;22(7):1011-7. doi: 10.1016/j.neunet.2009.07.027. Epub 2009 Jul 25. PMID: 19660907
- Specification of training sets and the number of hidden neurons for multilayer perceptrons. Neural Comput. 2001 Dec;13(12):2673-80. doi: 10.1162/089976601317098484. PMID: 11705406
- Multilayer perceptrons to approximate complex valued functions. Int J Neural Syst. 1995 Dec;6(4):435-46. doi: 10.1142/s0129065795000299. PMID: 8963472. Review.
- Nonlinear complex-valued extensions of Hebbian learning: an essay. Neural Comput. 2005 Apr;17(4):779-838. doi: 10.1162/0899766053429381. PMID: 15829090. Review.
Cited by
- Steady-State Visual Evoked Potential Classification Using Complex Valued Convolutional Neural Networks. Sensors (Basel). 2021 Aug 6;21(16):5309. doi: 10.3390/s21165309. PMID: 34450751. Free PMC article.
- Convergence analysis of fully complex backpropagation algorithm based on Wirtinger calculus. Cogn Neurodyn. 2014 Jun;8(3):261-6. doi: 10.1007/s11571-013-9276-7. Epub 2014 Jan 3. PMID: 24808934. Free PMC article.
- Exploring Feasibility of Multivariate Deep Learning Models in Predicting COVID-19 Epidemic. Front Public Health. 2021 Jul 5;9:661615. doi: 10.3389/fpubh.2021.661615. eCollection 2021. PMID: 34291025. Free PMC article.