[FIGURE 20.1 The style of neural computation.]

output. This error information is fed back to the system and adjusts the system parameters in a systematic fashion (the learning rule). The process is repeated until the performance is acceptable. It is clear from this description that the performance hinges heavily on the data. If one does not have data that cover a significant portion of the operating conditions, or if the data are noisy, then neural network technology is probably not the right solution. On the other hand, if there is plenty of data but the problem is too poorly understood to derive an approximate model, then neural network technology is a good choice.
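To make this procedure concrete, here is a minimal sketch of such a training loop for a single linear neuron adapted with the least-mean-square (LMS) rule. The toy data, step size, and stopping threshold are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy linear mapping from 3 inputs to 1 desired output.
X = rng.normal(size=(200, 3))
true_w = np.array([0.5, -1.0, 2.0])           # hypothetical target weights
d = X @ true_w + 0.05 * rng.normal(size=200)  # desired responses

w = np.zeros(3)   # network parameters, adjusted by the learning rule
eta = 0.01        # step size (assumed)

for epoch in range(100):
    for x, target in zip(X, d):
        y = w @ x                     # network output
        error = target - y            # compare with the desired output
        w += eta * error * x          # feed the error back (LMS update)
    mse = np.mean((d - X @ w) ** 2)   # performance after this pass
    if mse < 1e-3:                    # stop when performance is acceptable
        break
```

The same skeleton applies to any supervised topology: only the output computation and the parameter-update line (the learning rule) change.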
This operating procedure should be contrasted with traditional engineering design, which is built from exhaustive subsystem specifications and intercommunication protocols. In ANNs, the designer chooses the network topology, the performance function, the learning rule, and the criterion to stop the training phase, but the system adjusts the parameters automatically. It is therefore difficult to bring a priori information into the design, and when the system does not work properly it is also hard to refine the solution incrementally. On the other hand, ANN-based solutions are extremely efficient in terms of development time and resources, and in many difficult problems ANNs provide performance that is difficult to match with other technologies. Denker said 10 years ago that "ANNs are the second best way to implement a solution," motivated by the simplicity of their design and by their universality, shadowed only by the traditional design obtained by studying the physics of the problem. At present, ANNs are emerging as the technology of choice for many applications, such as pattern recognition, prediction, system identification, and control.

ANN Types and Applications

It is always risky to establish a taxonomy of a technology, but our motivation is to provide a quick overview of the application areas and the most popular topologies and learning paradigms.

| Application | Topology | Supervised Learning | Unsupervised Learning |
|---|---|---|---|
| Association | Hopfield [Zurada, 1992; Haykin, 1994] | — | Hebbian [Zurada, 1992; Haykin, 1994; Kung, 1993] |
| | Multilayer perceptron [Zurada, 1992; Haykin, 1994; Bishop, 1995] | Back-propagation [Zurada, 1992; Haykin, 1994; Bishop, 1995] | — |
| | Linear associative mem. [Zurada, 1992; Haykin, 1994] | — | Hebbian |
| Pattern recognition | Multilayer perceptron [Zurada, 1992; Haykin, 1994; Bishop, 1995] | Back-propagation | — |
| | Radial basis functions [Zurada, 1992; Bishop, 1995] | Least mean square | k-means [Bishop, 1995] |
| Feature extraction | Competitive [Zurada, 1992; Haykin, 1994] | — | Competitive |
| | Kohonen [Zurada, 1992; Haykin, 1994] | — | Kohonen |
| | Multilayer perceptron [Kung, 1993] | Back-propagation | — |
| | Principal comp. anal. [Zurada, 1992; Kung, 1993] | — | Oja's [Zurada, 1992; Kung, 1993] |
| Prediction, system ID | Time-lagged networks [Zurada, 1992; Kung, 1993; de Vries and Principe, 1992] | Back-propagation through time [Zurada, 1992] | — |
| | Fully recurrent nets [Zurada, 1992] | | |
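As one illustration of the unsupervised paradigms listed in the table, the following sketch implements Oja's rule for a single linear unit, whose weight vector converges toward the first principal component of its input. The toy data and learning rate are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Zero-mean toy data: variance 4 along e1 and 0.09 along e2, rotated 45
# degrees so the principal direction becomes (1, 1)/sqrt(2).
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = (rng.normal(size=(2000, 2)) * np.array([2.0, 0.3])) @ R.T

w = rng.normal(size=2)   # initial weight vector
eta = 0.01               # learning rate (assumed)

for x in X:
    y = w @ x                     # output of the linear unit
    w += eta * y * (x - y * w)    # Oja's rule: Hebbian term with decay

w /= np.linalg.norm(w)
# w now approximates the leading eigenvector of the input covariance,
# i.e., roughly (0.707, 0.707) for this data.
```

Note that no desired response appears anywhere in the loop: the update uses only the input and the unit's own output, which is what distinguishes the unsupervised column of the table from the supervised one.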