Subrecursive neural networks

Neural Networks. 2019 Aug;116:208-223. [epub] 2019 May 20

Language: English   Country: United States   Medium: print-electronic

Document type: journal article

Persistent link   https://www.medvik.cz/link/pmid31121419

Links
PubMed: 31121419
DOI: 10.1016/j.neunet.2019.04.019
PII: S0893-6080(19)30125-X
Knihovny.cz E-resources

It is known that discrete-time recurrent neural networks (NNs) with binary states, using the Heaviside activation function (with Boolean outputs 0 or 1), are equivalent to finite automata (level 3 in the Chomsky hierarchy), while analog-state NNs with rational weights, employing the saturated-linear function (with real-number outputs in the interval [0,1]), are Turing complete (Chomsky level 0) even with three analog units. It has been open, however, whether there exist subrecursive (i.e., sub-Turing) NN models occupying Chomsky levels 1 or 2. In this paper, we provide such a model: a binary-state NN extended with one extra analog unit (1ANN). We give a syntactic characterization of the languages accepted online by 1ANNs in terms of so-called cut languages, combined in a certain way by the usual language operations. We employ this characterization to prove that languages accepted by 1ANNs with rational weights are context-sensitive (Chomsky level 1), and we present explicit examples of such languages that are not context-free (i.e., lie above Chomsky level 2). In addition, we formulate a sufficient condition under which a 1ANN recognizes a regular language (Chomsky level 3), stated in terms of quasi-periodicity of parameters derived from its real weights; the condition is satisfied, e.g., for rational weights provided that the inverse of the real self-loop weight of the analog unit is a Pisot number.
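To make the 1ANN architecture described in the abstract concrete, the following Python snippet is a minimal sketch, not taken from the paper: it shows one synchronous update of a hypothetical discrete-time network in which all units but one apply the Heaviside function (binary states 0/1) and a single extra analog unit applies the saturated-linear function (real state in [0,1]). The weight matrix W, bias vector b, and the convention that the analog unit is last are illustrative assumptions, not the paper's formal definition.

```python
import numpy as np

def heaviside(x):
    """Binary-state activation: Boolean outputs 0 or 1."""
    return (x >= 0).astype(float)

def sat_linear(x):
    """Saturated-linear activation: real output clipped to [0, 1]."""
    return float(np.clip(x, 0.0, 1.0))

def step_1ann(W, b, state):
    """One synchronous update of a hypothetical 1ANN.

    Units 0..n-2 are binary-state (Heaviside); the last unit is the
    single extra analog unit (saturated-linear). W, b, and the unit
    ordering are assumptions of this sketch.
    """
    pre = W @ state + b                   # excitations of all units
    new_state = heaviside(pre)            # binary-state units
    new_state[-1] = sat_linear(pre[-1])   # the one analog unit
    return new_state
```

In this sketch, the self-loop weight of the analog unit would be W[-1, -1]; the abstract's regularity condition concerns the case where the inverse of that real weight is a Pisot number. The paper's actual online acceptance protocol (how input symbols are fed in and decisions read out) is defined in the article itself.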
