…treme learning machines: new trends and applications[J]. Science China information sciences, 2015, 58(2): 1-16.
[13] PAO Y H, TAKEFUJI Y. Functional-link net computing: theory, system architecture, and functionalities[J]. Computer, 1992, 25(5): 76-79.
[14] HUANG Guangbin, ZHU Qinyu, SIEW C K. Extreme learning machine: theory and applications[J]. Neurocomputing, 2006, 70(1/2/3): 489-501.
[15] RAHIMI A, RECHT B. Weighted sums of random kitchen sinks: replacing minimization with randomization in learning[C]//Proceedings of the 21st International Conference on Neural Information Processing Systems. Vancouver, British Columbia, Canada: Curran Associates Inc., 2008: 1313-1320.
[16] WIDROW B, GREENBLATT A, KIM Y, et al. The No-Prop algorithm: a new learning algorithm for multilayer neural networks[J]. Neural networks, 2013, 37: 182-188.
[17] KASUN L L C, ZHOU Hongming, HUANG Guangbin, et al. Representational learning with ELMs for big data[J]. IEEE intelligent systems, 2013, 28(6): 31-34.
[18] TANG Jiexiong, DENG Chenwei, HUANG Guangbin. Extreme learning machine for multilayer perceptron[J]. IEEE transactions on neural networks and learning systems, 2016, 27(4): 809-821.
[19] QIAO Junfei, LI Fanjun, HAN Honggui, et al. Constructive algorithm for fully connected cascade feedforward neural networks[J]. Neurocomputing, 2016, 182: 154-164.
[20] WAN Yihe, SONG Shiji, HUANG Gao. Incremental extreme learning machine based on cascade neural networks[C]//IEEE International Conference on Systems, Man, and Cybernetics. Kowloon: IEEE, 2015: 1889-1894.
[21] STEIL J J. Memory in backpropagation-decorrelation O(N) efficient online recurrent learning[C]//Proceedings of the 15th International Conference on Artificial Neural Networks: Formal Models and Their Applications. Berlin Heidelberg: Springer, 2005: 750-750.
[22] JAEGER H. The "echo state" approach to analyzing and training recurrent neural networks with an erratum note[R]. Bonn, Germany: German National Research Center for Information Technology, 2001.
[23] MAASS W. Liquid state machines: motivation, theory, and applications[J]. Computability in context, 2010: 275-296. DOI: 10.1142/9781848162778_0008.
[24] LIANG Nanying, SARATCHANDRAN P, HUANG Guangbin, et al. Classification of mental tasks from EEG signals using extreme learning machine[J]. International journal of neural systems, 2006, 16(1): 29-38.
[25] SKOWRONSKI M D, HARRIS J G. Automatic speech recognition using a predictive echo state network classifier[J]. Neural networks, 2007, 20(3): 414-423.
[26] LI Decai, HAN Min, WANG Jun. Chaotic time series prediction based on a novel robust echo state network[J]. IEEE transactions on neural networks and learning systems, 2012, 23(5): 787-799.
[27] SCHMIDT W F, KRAAIJVELD M A, DUIN R P W. Feedforward neural networks with random weights[C]//Proceedings of 11th International Conference on Pattern Recognition Methodology and Systems. Hague, Holland: IEEE, 1992: 1-4.
[28] IGELNIK B, PAO Y H. Stochastic choice of basis functions in adaptive function approximation and the functional-link neural net[J]. IEEE transactions on neural networks, 1995, 6(6): 1320-1329.
[29] HUANG Guangbin. An insight into extreme learning machines: random neurons, random features and kernels[J]. Cognitive computation, 2014, 6(3): 376-390.
[30] HUANG Guangbin, ZHU Qinyu, SIEW C K. Extreme learning machine: theory and applications[J]. Neurocomputing, 2006, 70(1/2/3): 489-501.
[31] HUANG G B, CHEN L, SIEW C K. Universal approximation using incremental constructive feedforward networks with random hidden nodes[J]. IEEE transactions on neural networks, 2006, 17(4): 879-892.
[32] LIU Xia, LIN Shaobo, FANG Jian, et al. Is extreme learning machine feasible? A theoretical assessment (Part I)[J]. IEEE transactions on neural networks and learning systems, 2014, 26(1): 7-20.
[33] DEHURI S, CHO S B. A comprehensive survey on functional link neural networks and an adaptive PSO-BP learning for CFLNN[J]. Neural computing and applications, 2010, 19(2): 187-205.
[34] ZHANG Le, SUGANTHAN P N. A comprehensive evaluation of random vector functional link networks[J]. Information sciences, 2015, 367/368: 1094-1105.
[35] LIANG Nanying, HUANG Guangbin, SARATCHANDRAN P, et al. A fast and accurate online sequential learning algorithm for feedforward networks[J]. IEEE transactions on neural networks, 2006, 17(6): 1411-1423.
[36] SCARDAPANE S, WANG Dianhui, PANELLA M, et al. Distributed learning for random vector functional-link networks[J]. Information sciences, 2015, 301: 271-284.
[37] ALHAMDOOSH M, WANG Dianhui. Fast decorrelated neural network ensembles with random weights[J]. Information sciences, 2014, 264: 104-117.
[38] LI Ying. Orthogonal incremental extreme learning machine for regression and multiclass classification[J]. Neural computing and applications, 2014, 27(1): 111-120.
[39] 李凡军, 乔俊飞, 韩红桂. 网络结构增长的极端学习机