Constructing general partial differential equations using polynomial and neural networks
L. Zjavka, W. Pedrycz
Language: English
Country: United States
Document type: Journal Article, Research Support, Non-U.S. Gov't
MeSH
- Algorithms
- Data Interpretation, Statistical
- Mathematics *
- Nonlinear Dynamics
- Neural Networks, Computer *
- Weather
- Computer Simulation
- Machine Learning
Publication type
- Journal Article
- Research Support, Non-U.S. Gov't
Sums of fraction terms can approximate multi-variable functions on the basis of discrete observations, replacing an explicit partial differential equation definition with polynomial descriptions of elementary data relations. Artificial neural networks commonly transform a weighted sum of inputs to describe the overall similarity between trained patterns and new testing input patterns. Differential polynomial neural networks form a new class of neural networks that construct and solve an unknown general partial differential equation of a function of interest, using selected relative substitution terms built from non-linear multi-variable composite polynomials. The layers of the network generate simple and composite relative substitution terms whose convergent series combinations can describe partial dependent derivative changes of the input variables. This regression is based on trained generalized partial-derivative data relations, decomposed into a multi-layer polynomial network structure. The sigmoidal function, commonly used as a non-linear activation of artificial neurons, may transform some polynomial items together with their parameters to improve the ability of the polynomial derivative term series to approximate complicated periodic functions, since simple low-order polynomials cannot fully capture complete cycles. The similarity analysis facilitates substitutions for differential equations or can form dimensional units from data samples to describe real-world problems.
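To make the construction described above concrete, the following Python/NumPy sketch illustrates only the general idea of a weighted sum of polynomial fraction terms with an optional sigmoidal transform; it is not the authors' D-PNN algorithm. The particular numerator and denominator polynomials, the pairwise term layout, the bias column, and the least-squares weight fit are hypothetical simplifications introduced here for illustration.

# Minimal, hypothetical sketch: each 2-variable block produces one "fraction term"
# (a reduced polynomial divided by a complete quadratic polynomial), optionally
# passed through a sigmoid, and the model output is a least-squares-weighted sum
# of these terms. This is an illustration, not the paper's D-PNN construction.
import numpy as np

def fraction_term(x1, x2, sigmoidal=False):
    """One 2-input polynomial block producing a single fraction term."""
    numerator = 1.0 + x1 + x1 * x2                          # hypothetical reduced polynomial
    denominator = 1.0 + x1 + x2 + x1 * x2 + x1**2 + x2**2   # complete 2-variable quadratic
    term = numerator / denominator
    if sigmoidal:
        term = 1.0 / (1.0 + np.exp(-term))                  # sigmoidal transform of the term
    return term

def design_matrix(X, sigmoidal=False):
    """Stack one fraction term per input-variable pair, plus a bias column."""
    n_samples, n_vars = X.shape
    cols = [np.ones(n_samples)]
    for i in range(n_vars):
        for j in range(i + 1, n_vars):
            cols.append(fraction_term(X[:, i], X[:, j], sigmoidal))
    return np.column_stack(cols)

def fit_sum_of_terms(X, y, sigmoidal=False):
    """Fit the weights of the term sum by ordinary least squares."""
    T = design_matrix(X, sigmoidal)
    weights, *_ = np.linalg.lstsq(T, y, rcond=None)
    return weights

def predict(X, weights, sigmoidal=False):
    return design_matrix(X, sigmoidal) @ weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(0.1, 1.0, size=(200, 3))
    y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2]                 # toy multi-variable target
    w = fit_sum_of_terms(X, y, sigmoidal=True)
    rmse = np.sqrt(np.mean((predict(X, w, sigmoidal=True) - y) ** 2))
    print("training RMSE:", rmse)

The sketch keeps the parts of the description that are unambiguous (fraction terms built from low-order polynomials, an optional sigmoidal transform of a term, and a final weighted sum fitted to data); everything else is a placeholder choice.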
Department of Electrical and Computer Engineering, University of Alberta, Edmonton T6R 2V4 AB, Canada
Systems Research Institute, Polish Academy of Sciences, Warsaw, Poland
- 000
- 00000naa a2200000 a 4500
- 001
- bmc16027988
- 003
- CZ-PrNML
- 005
- 20161027120850.0
- 007
- ta
- 008
- 161005s2016 xxu f 000 0|eng||
- 009
- AR
- 024 7_
- $a 10.1016/j.neunet.2015.10.001 $2 doi
- 035 __
- $a (PubMed)26547244
- 040 __
- $a ABA008 $b cze $d ABA008 $e AACR2
- 041 0_
- $a eng
- 044 __
- $a xxu
- 100 1_
- $a Zjavka, Ladislav $u VŠB-Technical University of Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Computer Science, 17. listopadu 15/2172 Ostrava, Czech Republic. Electronic address: lzjavka@gmail.com.
- 245 10
- $a Constructing general partial differential equations using polynomial and neural networks / $c L. Zjavka, W. Pedrycz,
- 520 9_
- $a Sums of fraction terms can approximate multi-variable functions on the basis of discrete observations, replacing an explicit partial differential equation definition with polynomial descriptions of elementary data relations. Artificial neural networks commonly transform a weighted sum of inputs to describe the overall similarity between trained patterns and new testing input patterns. Differential polynomial neural networks form a new class of neural networks that construct and solve an unknown general partial differential equation of a function of interest, using selected relative substitution terms built from non-linear multi-variable composite polynomials. The layers of the network generate simple and composite relative substitution terms whose convergent series combinations can describe partial dependent derivative changes of the input variables. This regression is based on trained generalized partial-derivative data relations, decomposed into a multi-layer polynomial network structure. The sigmoidal function, commonly used as a non-linear activation of artificial neurons, may transform some polynomial items together with their parameters to improve the ability of the polynomial derivative term series to approximate complicated periodic functions, since simple low-order polynomials cannot fully capture complete cycles. The similarity analysis facilitates substitutions for differential equations or can form dimensional units from data samples to describe real-world problems.
- 650 _2
- $a Algorithms $7 D000465
- 650 _2
- $a Computer Simulation $7 D003198
- 650 _2
- $a Data Interpretation, Statistical $7 D003627
- 650 _2
- $a Machine Learning $7 D000069550
- 650 12
- $a Mathematics $7 D008433
- 650 12
- $a Neural Networks, Computer $7 D016571
- 650 _2
- $a Nonlinear Dynamics $7 D017711
- 650 _2
- $a Weather $7 D014887
- 655 _2
- $a Journal Article $7 D016428
- 655 _2
- $a Research Support, Non-U.S. Gov't $7 D013485
- 700 1_
- $a Pedrycz, Witold $u Department of Electrical & Computer Engineering, University of Alberta, Edmonton T6R 2V4 AB, Canada; Department of Electrical and Computer Engineering, Faculty of Engineering, King Abdulaziz University, Jeddah, 21589, Saudi Arabia; Systems Research Institute, Polish Academy of Sciences Warsaw, Poland. Electronic address: wpedrycz@ualberta.ca.
- 773 0_
- $w MED00011811 $t Neural networks : the official journal of the International Neural Network Society $x 1879-2782 $g Vol. 73, no. - (2016), p. 58-69
- 856 41
- $u https://pubmed.ncbi.nlm.nih.gov/26547244 $y Pubmed
- 910 __
- $a ABA008 $b sig $c sign $y a $z 0
- 990 __
- $a 20161005 $b ABA008
- 991 __
- $a 20161027121307 $b ABA008
- 999 __
- $a ok $b bmc $g 1166302 $s 952618
- BAS __
- $a 3
- BAS __
- $a PreBMC
- BMC __
- $a 2016 $b 73 $c - $d 58-69 $e 20151020 $i 1879-2782 $m Neural networks $n Neural Netw $x MED00011811
- LZP __
- $a Pubmed-20161005