6.001 Structure and Interpretation of Computer Programs. Copyright © 2004 by Massachusetts Institute of Technology.

Slide 32.1.4
The apparently simple introduction of local state and mutation into our language thus has some drastic consequences: it raises questions about sameness and change. The central issue lurking beneath the complexity of state, sameness, and change is that by introducing assignment we are forced to admit time into our computational models. Before we introduced assignment, all our programs were timeless, in the sense that any expression that has a value always has the same value. Thus, calling (d1 5) would always return the same value. In contrast, look at our modeling of deposits to a bank account that returns the resulting balance: here successive evaluations of the same expression yield different values. This behavior arises from the fact that the execution of assignment statements (in this case, assignments to the variable balance) delineates moments in time when values change. The result of evaluating an expression depends not only on the expression itself, but also on whether the evaluation occurs before or after these moments. Before assignment, the order of evaluation did not matter; now it does. Building models in terms of computational objects with local state forces us to confront time as an essential concept in programming.
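As a concrete illustration, here is a minimal sketch of such a bank-account object with local state (the procedure name make-account and the starting balance of 100 are illustrative assumptions, not taken from the slide):

    (define (make-account balance)
      ;; deposit mutates the local variable balance, so the value it
      ;; returns depends on when it is called
      (define (deposit amount)
        (set! balance (+ balance amount))
        balance)
      (define (dispatch m)
        (cond ((eq? m 'deposit) deposit)
              (else (error "Unknown request" m))))
      dispatch)

    (define acc (make-account 100))
    ((acc 'deposit) 5)   ; => 105
    ((acc 'deposit) 5)   ; => 110  -- same expression, different value

The second call returns a different result from the first even though the expression is textually identical: each assignment to balance marks a moment in time, and the result depends on whether an evaluation happens before or after that moment.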
Slide 32.1.5
We can go further in structuring computational models to match our perception of the physical world. Objects in the world do not change one at a time in sequence; rather, we perceive them as acting concurrently, all at once. So it is often natural to model systems as collections of computational processes that execute concurrently. Just as we can make our programs modular by organizing models in terms of objects with separate local state, it is often appropriate to divide computational models into parts that evolve separately and concurrently. Even if the programs are to be executed on a sequential computer, the practice of writing programs as if they were to be executed concurrently forces the programmer to avoid inessential timing constraints and thus makes programs more modular.

In addition to making programs more modular, concurrent computation can provide a speed advantage over sequential computation. Sequential computers execute only one operation at a time, so the amount of time it takes to perform a task is proportional to the total number of operations performed. However, if it is possible to decompose a problem into pieces that are relatively independent and need to communicate only rarely, it may be possible to allocate the pieces to separate computing processors, producing a speed advantage proportional to the number of processors available.

Unfortunately, the complexities introduced by assignment become even more problematic in the presence of concurrency. The fact of concurrent execution, whether because the world operates in parallel or because our computers do, entails additional complexity in our understanding of time. Today we are going to look at those issues and at ways to try to get around them.
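To make the danger concrete, here is a minimal sketch in the style of SICP's parallel-execute (assumed to be available, as in section 3.4 of the text; it is not part of standard Scheme). Two concurrent deposits to a shared balance can interleave their reads and writes, so the final result depends on timing:

    (define balance 100)

    (define (deposit amount)
      ;; read-modify-write on shared state: the read of balance and the
      ;; set! are separate steps, so two concurrent calls can interleave
      (set! balance (+ balance amount)))

    ;; Run both deposits concurrently.  Depending on how the steps
    ;; interleave, the final balance may be 130 (both deposits take
    ;; effect), or 110 or 120 (one deposit overwrites the other).
    (parallel-execute
      (lambda () (deposit 10))
      (lambda () (deposit 20)))

This is exactly the kind of uncontrolled interaction between concurrent processes that the rest of the lecture looks at ways to control.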