brain. Modeling a biological nervous system using ANNs can also increase our understanding of biological functions. State-of-the-art computer hardware technology (such as VLSI and optical) has made this modeling feasible.

A thorough study of ANNs requires knowledge of neurophysiology, cognitive science/psychology, physics (statistical mechanics), control theory, computer science, artificial intelligence, statistics/mathematics, pattern recognition, computer vision, parallel processing, and hardware (digital/analog/VLSI/optical). New developments in these disciplines continuously nourish the field. On the other hand, ANNs also provide an impetus to these disciplines in the form of new tools and representations. This symbiosis is necessary for the vitality of neural network research. Communications among these disciplines ought to be encouraged.

Table 1. Von Neumann computer versus biological neural system.

                        Von Neumann computer        Biological neural system
Processor               Complex                     Simple
                        High speed                  Low speed
                        One or a few                A large number
Memory                  Separate from a processor   Integrated into processor
                        Localized                   Distributed
                        Noncontent addressable      Content addressable
Computing               Centralized                 Distributed
                        Sequential                  Parallel
                        Stored programs             Self-learning
Reliability             Very vulnerable             Robust
Expertise               Numerical and symbolic      Perceptual
                        manipulations               problems
Operating               Well-defined,               Poorly defined,
environment             well-constrained            unconstrained

Brief historical review

ANN research has experienced three periods of extensive activity. The first peak in the 1940s was due to McCulloch and Pitts' pioneering work.4 The second occurred in the 1960s with Rosenblatt's perceptron convergence theorem5 and Minsky and Papert's work showing the limitations of a simple perceptron.6 Minsky and Papert's results dampened the enthusiasm of most researchers, especially those in the computer science community. The resulting lull in neural network research lasted almost 20 years. Since the early 1980s, ANNs have received considerable renewed interest. The major developments behind this resurgence include Hopfield's energy approach7 in 1982 and the back-propagation learning algorithm for multilayer perceptrons (multilayer feedforward networks), first proposed by Werbos,8 reinvented several times, and then popularized by Rumelhart et al.9 in 1986. Anderson and Rosenfeld10 provide a detailed historical account of ANN developments.

Biological neural networks

A neuron (or nerve cell) is a special biological cell that processes information (see Figure 1). It is composed of a cell body, or soma, and two types of out-reaching tree-like branches: the axon and the dendrites. The cell body has a nucleus that contains information about hereditary traits and a plasma that holds the molecular equipment for producing material needed by the neuron. A neuron receives signals (impulses) from other neurons through its dendrites (receivers) and transmits signals generated by its cell body along the axon (transmitter), which eventually branches into strands and substrands. At the terminals of these strands are the synapses. A synapse is an elementary structure and functional unit between two neurons (an axon strand of one neuron and a dendrite of another). When the impulse reaches the synapse's terminal, certain chemicals called neurotransmitters are released. The neurotransmitters diffuse across the synaptic gap to enhance or inhibit, depending on the type of the synapse, the receptor neuron's own tendency to emit electrical impulses. The synapse's effectiveness can be adjusted by the signals passing through it, so that synapses can learn from the activities in which they participate. This dependence on history acts as a memory, which is possibly responsible for human memory.

Figure 1. A sketch of a biological neuron.

The cerebral cortex in humans is a large flat sheet of neurons about 2 to 3 millimeters thick with a surface area of about 2,200 cm2, about twice the area of a standard computer keyboard. The cerebral cortex contains about 10^11 neurons, which is approximately the number of stars in the Milky Way.11 Neurons are massively connected, much more complex and dense than telephone networks. Each neuron is connected to 10^3 to 10^4 other neurons. In total, the human brain contains approximately 10^14 to 10^15 interconnections.

Neurons communicate through a very short train of pulses, typically milliseconds in duration. The message is modulated on the pulse-transmission frequency. This frequency can vary from a few to several hundred hertz, which is a million times slower than the fastest switching speed in electronic circuits. However, complex perceptual decisions such as face recognition are typically made by humans within a few hundred milliseconds. These decisions are made by a network of neurons whose operational speed is only a few milliseconds. This implies that the computations cannot take more than about 100 serial stages. In other

March 1996
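The threshold unit that McCulloch and Pitts proposed, and the synapse behavior described above (an excitatory or inhibitory influence on the receiving neuron's tendency to fire), can be sketched in a few lines of Python. The weights, threshold, and input patterns below are illustrative choices, not values from the article:

```python
def mcculloch_pitts_neuron(inputs, weights, threshold):
    """Binary threshold unit: fire (return 1) if the weighted sum of the
    binary inputs reaches the threshold, otherwise stay silent (return 0)."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Excitatory synapses modeled as +1 weights, an inhibitory synapse as -1.
weights = [1, 1, -1]
print(mcculloch_pitts_neuron([1, 1, 0], weights, threshold=2))  # fires: 1
print(mcculloch_pitts_neuron([1, 1, 1], weights, threshold=2))  # inhibited: 0
```

Rosenblatt's perceptron, mentioned in the historical review, uses essentially this same unit but learns the weights from labeled examples instead of fixing them by hand.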
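The "about 100 serial stages" bound follows from simple arithmetic on the timing figures just given; the concrete values below are illustrative picks from the stated ranges ("a few hundred milliseconds" per decision, "a few milliseconds" per neural operation):

```python
decision_time_ms = 500  # a perceptual decision: a few hundred milliseconds
neuron_cycle_ms = 5     # one neural operation: a few milliseconds

# Maximum depth of a strictly serial chain of neural operations
# that fits inside one perceptual decision.
max_serial_stages = decision_time_ms // neuron_cycle_ms
print(max_serial_stages)  # on the order of 100
```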