5 Towards object technology

Extendibility, reusability and reliability, our principal goals, require a set of conditions defined in the preceding chapters. To achieve these conditions, we need a systematic method for decomposing systems into modules. This chapter presents the basic elements of such a method, based on a simple but far-reaching idea: build every module on the basis of some object type. It explains the idea, develops the rationale for it, and explores some of the immediate consequences.

A word of warning. Given today’s apparent prominence of object technology, some readers might think that the battle has been won and that no further rationale is necessary. This would be a mistake: we need to understand the basis for the method, if only to avoid common misuses and pitfalls. It is in fact frequent to see the word “object-oriented” (like “structured” in an earlier era) used as mere veneer over the most conventional techniques. Only by carefully building the case for object technology can we learn to detect improper uses of the buzzword, and stay away from common mistakes reviewed later in this chapter.

5.1 THE INGREDIENTS OF COMPUTATION

The crucial question in our search for proper software architectures is modularization: what criteria should we use to find the modules of our software? To obtain the proper answer we must first examine the contending candidates.

The basic triangle

Three forces are at play when we use software to perform some computations:

[Figure: The three forces of computation: Action, Object, Processor.]
102 TOWARDS OBJECT TECHNOLOGY §5.1

To execute a software system is to use certain processors to apply certain actions to certain objects.

The processors are the computation devices, physical or virtual, that execute instructions. A processor can be an actual processing unit (the CPU of a computer), a process on a conventional operating system, or a “thread” if the OS is multi-threaded.

The actions are the operations making up the computation. The exact form of the actions that we consider will depend on the level of granularity of our analysis: at the hardware level, actions are machine language operations; at the level of the hardware-software machine, they are instructions of the programming language; at the level of a software system, we can treat each major step of a complex algorithm as a single action.

The objects are the data structures to which the actions apply. Some of these objects, the data structures built by a computation for its own purposes, are internal and exist only while the computation proceeds; others (contained in the files, databases and other persistent repositories) are external and may outlive individual computations.

Processors will become important when we discuss concurrent forms of computation, in which several sub-computations can proceed in parallel; then we will need to consider two or more processors, physical or virtual. But that is the topic of a later chapter (chapter 30); for the moment we can limit our attention to non-concurrent, or sequential computations, relying on a single processor which will remain implicit.

This leaves us with actions and objects. The duality between actions and objects — what a system does vs. what it does it to — is a pervasive theme in software engineering.

A note of terminology. Synonyms are available to denote each of the two aspects: the word data will be used here as a synonym for objects; for action the discussion will often follow common practice and talk about the functions of a system.
The term “function” is not without disadvantages, since software discussions also use it in at least two other meanings: the mathematical sense, and the programming sense of a subprogram returning a result. But we can use it without ambiguity in the phrase the functions of a system, which is what we need here.

The reason for using this word rather than “action” is the mere grammatical convenience of having an associated adjective, used in the phrase functional decomposition. “Action” has no comparable derivation. Another term whose meaning is equivalent to that of “action” for the purpose of this discussion is operation.

Any discussion of software issues must account for both the object and function aspects; so must the design of any software system. But there is one question for which we must choose — the question of this chapter: what is the appropriate criterion for finding the modules of a system? Here we must decide whether modules will be built as units of functional decomposition, or around major types of objects.

From the answer will follow the difference between the object-oriented approach and other methods. Traditional approaches build each module around some unit of functional decomposition — a certain piece of the action. The object-oriented method, instead, builds each module around some type of objects.
§5.2 FUNCTIONAL DECOMPOSITION 103

This book, predictably, develops the latter approach. But we should not just embrace O-O decomposition because the title of the book so implies, or because it is the “in” thing to do. The next few sections will carefully examine the arguments that justify using object types as the basis for modularization — starting with an exploration of the merits and limitations of traditional, non-O-O methods. Then we will try to get a clearer understanding of what the word “object” really means for software development, although the full answer, requiring a little theoretical detour, will only emerge in the next chapter.

We will also have to wait until the next chapter for the final settlement of the formidable and ancient fight that provides the theme for the rest of the present discussion: the War of the Objects and the Functions. As we prepare ourselves for a campaign of slander against the functions as a basis for system decomposition, and of corresponding praise for the objects, we must not forget the observation made above: in the end, our solution to the software structuring problem must provide space for both functions and objects — although not necessarily on an equal basis. To discover this new world order, we will need to define the respective roles of its first-class and second-class citizens.

5.2 FUNCTIONAL DECOMPOSITION

We should first examine the merits and limitations of the traditional approach: using functions as a basis for the architecture of software systems. This will not only lead us to appreciate why we need something else — object technology — but also help us avoid, when we do move into the object world, certain methodological pitfalls such as premature operation ordering, which have been known to fool even experienced O-O developers.
Continuity

A key element in answering the question “should we structure systems around functions or around data?” is the problem of extendibility, and more precisely the goal called continuity in our earlier discussions (“Modular continuity”, page 44). As you will recall, a design method satisfies this criterion if it yields stable architectures, keeping the amount of design change commensurate with the size of the specification change.

Continuity is a crucial concern if we consider the real lifecycle of software systems, including not just the production of an acceptable initial version, but a system’s long-term evolution. Most systems undergo numerous changes after their first delivery. Any model of software development that only considers the period leading to that delivery and ignores the subsequent era of change and revision is as remote from real life as those novels which end when the hero marries the heroine — the time which, as everyone knows, marks the beginning of the really interesting part.

To evaluate the quality of an architecture (and of the method that produced it), we should not just consider how easy it was to obtain this architecture initially: it is just as important to ascertain how well the architecture will weather change.

The traditional answer to the question of modularization has been top-down functional decomposition, briefly introduced in an earlier chapter (“Modular decomposability”, page 40). How well does top-down design respond to the requirements of modularity?
104 TOWARDS OBJECT TECHNOLOGY §5.2

Top-down development

There was a most ingenious architect who had contrived a new method for building houses, by beginning at the roof, and working downwards to the foundation, which he justified to me by the like practice of those two prudent insects, the bee and the spider.
Jonathan Swift: Gulliver’s Travels, Part III, A Voyage to Laputa, etc., Chapter 5.

The top-down approach builds a system by stepwise refinement, starting with a definition of its abstract function. You start the process by expressing a topmost statement of this function, such as

[C0]
“Translate a C program to machine code”

or:

[P0]
“Process a user command”

and continue with a sequence of refinement steps. Each step must decrease the level of abstraction of the elements obtained; it decomposes every operation into a combination of one or more simpler operations. For example, the next step in the first example (the C compiler) could produce the decomposition

[C1]
“Read program and produce sequence of tokens”
“Parse sequence of tokens into abstract syntax tree”
“Decorate tree with semantic information”
“Generate code from decorated tree”

or, using an alternative structure (and making the simplifying assumption that a C program is a sequence of function definitions):

[C'1]
from
    “Initialize data structures”
until
    “All function definitions processed”
loop
    “Read in next function definition”
    “Generate partial code”
end
“Fill in cross references”
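Refinement [C1] reads as a pipeline of four operations, each consuming the result of the previous one. The Python sketch below transcribes it almost literally; the token and tree representations are toy assumptions chosen only for illustration, not those of any real C compiler:

```python
# Toy transcription of refinement [C1]; all representations are illustrative.

def tokenize(source):
    # "Read program and produce sequence of tokens"
    return source.split()

def parse(tokens):
    # "Parse sequence of tokens into abstract syntax tree"
    # (toy tree: a (kind, children) pair)
    return ("program", tokens)

def decorate(tree):
    # "Decorate tree with semantic information"
    kind, children = tree
    return (kind, children, {"type_checked": True})

def generate(decorated):
    # "Generate code from decorated tree"
    kind, children, info = decorated
    return ["PUSH " + tok for tok in children]

def compile_program(source):
    # The topmost statement [C0]: "Translate a C program to machine code"
    return generate(decorate(parse(tokenize(source))))

print(compile_program("a b"))  # ['PUSH a', 'PUSH b']
```

Each refinement step of the method then amounts to replacing one of these function bodies by a more detailed combination of simpler operations.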
§5.2 FUNCTIONAL DECOMPOSITION 105

In either case, the developer must at each step examine the remaining incompletely expanded elements (such as “Read program …” and “All function definitions processed”) and expand them, using the same refinement process, until everything is at a level of abstraction low enough to allow direct implementation.

We may picture the process of top-down refinement as the development of a tree. Nodes represent elements of the decomposition; branches show the relation “B is part of the refinement of A”.

[Figure: Top-down design: tree structure. The topmost functional abstraction is refined into sequence, loop and conditional nodes. This figure first appeared on page 41.]

The top-down approach has a number of advantages. It is a logical, well-organized thought discipline; it can be taught effectively; it encourages orderly development of systems; it helps the designer find a way through the apparent complexity that systems often present at the initial stages of their design.

The top-down approach can indeed be useful for developing individual algorithms. But it also suffers from limitations that make it questionable as a tool for the design of entire systems:

• The very idea of characterizing a system by just one function is subject to doubt.

• By using as a basis for modular decomposition the properties that tend to change the most, the method fails to account for the evolutionary nature of software systems.

Not just one function

In the evolution of a system, what may originally have been perceived as the system’s main function may become less important over time.

Consider a typical payroll system. When stating his initial requirement, the customer may have envisioned just what the name suggests: a system to produce paychecks from the appropriate data. His view of the system, implicit or explicit, may have been a more ambitious version of this:
106 TOWARDS OBJECT TECHNOLOGY §5.2

[Figure: Structure of a simple payroll program. Inputs: Employee Information, Hours Worked; operation: Produce Paychecks; output: Paychecks.]

The system takes some inputs (such as record of hours worked and employee information) and produces some outputs (paychecks and so on). This is a simple enough functional specification, in the strict sense of the word functional: it defines the program as a mechanism to perform one function — pay the employees. The top-down functional method is meant precisely for such well-defined problems, where the task is to perform a single function — the “top” of the system to be built.

Assume, however, that the development of our payroll program is a success: the program does the requisite job. Most likely, the development will not stop there. Good systems have the detestable habit of giving their users plenty of ideas about all the other things they could do. As the system’s developer, you may initially have been told that all you had to do was to generate paychecks and a few auxiliary outputs. But now the requests for extensions start landing on your desk: Could the program gather some statistics on the side? I did tell you that next quarter we are going to start paying some employees monthly and others biweekly, did I not? And, by the way, I need a summary every month for management, and one every quarter for the shareholders. The accountants want their own output for tax preparation purposes. Also, you are keeping all this salary information, right? It would be really nifty to let Personnel access it interactively. I cannot imagine why that would be a difficult functionality to add.

This phenomenon of having to add unanticipated functions to successful systems occurs in all application areas. A nuclear code that initially just applied some algorithm to produce tables of numbers from batch input will be extended to handle graphical input and output or to maintain a database of previous results.
A compiler that just translated valid source into object code will after a while double up as a syntax verifier, a static analyzer, a pretty-printer, even a programming environment.

This change process is often incremental. The new requirements evolve from the initial ones in a continuous way. The new system is still, in many respects, “the same system” as the old one: still a payroll system, a nuclear code, a compiler. But the original “main function”, which may have seemed so important at first, often becomes just one of many functions; sometimes, it just vanishes, having outlived its usefulness.

If analysis and design have used a decomposition method based on the function, the system structure will follow from the designers’ original understanding of the system’s main function. As the system evolves, the designers may feel sorry (or its maintainers, if different people, may feel angry) about that original assessment. Each addition of a new function, however incremental it seems to the customer, risks invalidating the entire structure.

It is crucial to find, as a criterion for decomposition, properties less volatile than the system’s main function.
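As a small foretaste of the chapter's thesis, the contrast can be sketched in a few lines of Python. Both fragments below are illustrative assumptions (the payroll example is given here without code): the first freezes the structure around the original main function, while the second, built around an object type, absorbs a later unanticipated service without disturbing what was already there:

```python
# Function-centered structure: the system IS its main function, and
# every extension must be grafted onto this one entry point.
def produce_paychecks(employees, hours_worked):
    # employees: list of {"name": ..., "rate": ...}; hours_worked: name -> hours
    return [(e["name"], e["rate"] * hours_worked[e["name"]]) for e in employees]

# Data-centered structure: the module is built around the object type
# EMPLOYEE; producing paychecks becomes just one service among others.
class Employee:
    def __init__(self, name, hourly_rate):
        self.name = name
        self.hourly_rate = hourly_rate

    def paycheck(self, hours):
        return self.hourly_rate * hours

    # A later, unanticipated request fits in without touching the rest:
    def quarterly_summary(self, hours_per_week):
        # 13 weeks per quarter (illustrative simplification)
        return 13 * self.paycheck(hours_per_week)

staff = [Employee("Ada", 50), Employee("Grace", 60)]
print([e.paycheck(40) for e in staff])  # [2000, 2400]
```

The names and the payroll rules here are made up; the point is only structural, namely which criterion fixes the module boundaries.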
§5.2 FUNCTIONAL DECOMPOSITION 107

Finding the top

Top-down methods assume that every system is characterized, at the most abstract level, by its main function. Although it is indeed easy to specify textbook examples of algorithmic problems — the Tower of Hanoi, the Eight Queens and the like — through their functional “tops”, a more useful description of practical software systems considers each of them as offering a number of services. Defining such a system by a single function is usually possible, but yields a rather artificial view.

Take an operating system. It is best understood as a system that provides certain services: allocating CPU time, managing memory, handling input and output devices, decoding and carrying out users’ commands. The modules of a well-structured OS will tend to organize themselves around these groups of functions. But this is not the architecture that you will get from top-down functional decomposition; the method forces you, as the designer, to answer the artificial question “what is the topmost function?”, and then to use the successive refinements of the answer as a basis for the structure. If hard pressed you could probably come up with an initial answer of the form

“Process all user requests”

which you could then refine into something like

from
    boot
until
    halted or crashed
loop
    “Read in a user’s request and put it into input queue”
    “Get a request r from input queue”
    “Process r”
    “Put result into output queue”
    “Get a result o from output queue”
    “Output o to its recipient”
end

Refinements can go on. From such premises, however, it is unlikely that anyone can ever develop a reasonably structured operating system.

Even systems which may at first seem to belong to the “one input, one abstract function, one output” category reveal, on closer examination, a more diverse picture. Consider the earlier example of a compiler.
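Before turning to the compiler, note that the request-processing refinement above can be transcribed almost literally into executable form, which makes its weakness visible. In the Python sketch below, the request source, the halting condition and process_request are all illustrative stand-ins; nothing here pretends to be an operating system:

```python
from collections import deque

def run(requests, process_request):
    # "Process all user requests", refined: read, queue, process, output.
    input_queue, output_queue = deque(), deque()
    outputs = []
    for request in requests:           # "until halted": here, end of input
        input_queue.append(request)    # "Read in a user's request and put it into input queue"
        r = input_queue.popleft()      # "Get a request r from input queue"
        result = process_request(r)    # "Process r"
        output_queue.append(result)    # "Put result into output queue"
        o = output_queue.popleft()     # "Get a result o from output queue"
        outputs.append(o)              # "Output o to its recipient"
    return outputs

print(run(["ls", "halt"], str.upper))  # ['LS', 'HALT']
```

Even this faithful transcription shows the problem: everything is organized around the single processing loop, while the queues, requests and results, the stable elements around which a well-structured OS would build its modules, appear only as incidental locals.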
Reduced to its bare essentials, or to the view of older textbooks, a compiler is the implementation of one input-to-output function: transforming source text in some programming language into machine code for a certain platform. But that is not a sufficient view of a modern compiler. Among its many services, a compiler will perform error detection, program formatting, some configuration management, logging and report generation.
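A minimal sketch can make this concrete. All names below are hypothetical stand-ins, not taken from any real compiler; the point is only that the official source-to-code function sits alongside several unrelated services operating on the same source text:

```python
# Purely illustrative toy "compiler" services; every name is hypothetical.

def translate(source):
    # The "bare essentials" input-to-output function: source text to code.
    return [f"PUSH {word}" for word in source.split()]

def detect_errors(source):
    # Error-detection service: here, just flag empty input.
    return [] if source.strip() else ["empty source"]

def format_source(source):
    # Program-formatting service: normalize spacing.
    return " ".join(source.split())

def report(source):
    # Report-generation service: summarize the source text.
    return f"{len(source.split())} tokens"
```

None of the last three services is derivable from the translation function alone; a module organized around the source-text data they share accommodates them all.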
Another example is a typesetting program, taking input in some text processing format — TEX, Microsoft Word, FrameMaker … — and generating output in HTML, PostScript or Adobe Acrobat format. Again we may view it at first as just an input-to-output filter. But most likely it will perform a number of other services as well, so it seems more interesting, when we are trying to characterize the system in the most general way, to consider the various types of data it manipulates: documents, chapters, sections, paragraphs, lines, words, characters, fonts, running heads, titles, figures and others.

The seemingly obvious starting point of top-down design — the view that each new development fulfills a request for a specific function — is subject to doubt:

	Real systems have no top.

Functions and evolution

Not only is the main function often not the best criterion to characterize a system initially: it may also, as the system evolves, be among the first properties to change, forcing the top-down designer into frequent redesign and defeating our attempts to satisfy the continuity requirement.

Consider the example of a program that has two versions, a “batch” one which handles every session as a single big run over the problem, and an interactive one in which a session is a sequence of transactions, with a much finer grain of user-system communication. This is typical of large scientific programs, which often have a “let it run a big chunk of computation for the whole night” version and a “let me try out a few things and see the results at once, then continue with something else” version.

The top-down refinement of the batch version might begin as

	[B0] -- Top-level abstraction
		“Solve a complete instance of the problem”

	[B1] -- First refinement
		“Read input values”
		“Compute results”
		“Output results”

and so on. The top-down development of the interactive version, for its part, could proceed in the following style:
	[I1]
		“Process one transaction”

	[I2]
		if “New information provided by the user” then
			“Input information”
			“Store it”
		elseif “Request for information previously given” then
			“Retrieve requested information”
			“Output it”
		elseif “Request for result” then
			if “Necessary information available” then
				“Retrieve requested result”
				“Output it”
			else
				“Ask for confirmation of the request”
				if Yes then
					“Obtain required information”
					“Compute requested result”
					“Output result”
				end
			end
		else
			(Etc.)

Started this way, the development will yield an entirely different result. The top-down approach fails to account for the property that the final programs are but two different versions of the same software system — whether they are developed concurrently or one has evolved from the other.

This example brings to light two of the most unpleasant consequences of the top-down approach: its focus on the external interface (implying here an early choice between batch and interactive) and its premature binding of temporal relations (the order in which actions will be executed).

Interfaces and software design

System architecture should be based on substance, not form. But top-down development tends to use the most superficial aspect of the system — its external interface — as a basis for its structure.

The focus on external interfaces is inevitable in a method that asks “What will the system do for the end user?” as the key question: the answer will tend to emphasize the most external aspects.
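One way to read “substance, not form” on the earlier batch/interactive example: keep the computation separate from either mode of interaction. A minimal sketch, with all names hypothetical and a trivial summation standing in for the real computation; the core is written once, and the batch refinement [B1] and the interactive refinement [I2] become thin front ends over it:

```python
# Illustrative sketch only; every name here is hypothetical.

def compute_result(values):
    # The stable "substance" of the system; a trivial stand-in computation.
    return sum(values)

def batch_front_end(values):
    # [B1] style: the whole session is one big run -- read, compute, output.
    return [compute_result(values)]

def interactive_front_end(transactions):
    # [I2] style: a session is a sequence of fine-grained transactions.
    store, outputs = {}, []
    for kind, key, value in transactions:
        if kind == "store":        # "New information provided by the user"
            store[key] = value
        elif kind == "retrieve":   # "Request for information previously given"
            outputs.append(store[key])
        elif kind == "result":     # "Request for result"
            outputs.append(compute_result(store.values()))
    return outputs
```

Both front ends deliver consistent results from the same core; the batch-versus-interactive choice, which top-down refinement froze at the very top, here affects only the outer layer.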
The user interface is only one of the components of a system. Often, it is also among the most volatile, if only because of the difficulty of getting it right the first time; initial versions may be off the mark, requiring experimentation and user feedback to obtain a satisfactory solution. A healthy design method will try to separate the interface from the rest of the system, using more stable properties as the basis for system structuring.

It is in fact often possible to build the interface separately from the rest of the system, using one of the many tools available nowadays to produce elegant and user-friendly interfaces, often based on object-oriented techniques. The user interface then becomes almost irrelevant to the overall system design.

Premature ordering

The preceding examples illustrate another drawback of top-down functional decomposition: premature emphasis on temporal constraints. Each refinement expands a piece of the abstract structure into a more detailed control architecture, specifying the order in which various functions (various pieces of the action) will be executed. Such ordering constraints become essential properties of the system architecture; but they too are subject to change.

Recall the two alternative candidate structures for the first refinement of a compiler:

	[C1]
		“Read program and produce sequence of tokens”
		“Parse sequence of tokens into abstract syntax tree”
		“Decorate tree with semantic information”
		“Generate code from decorated tree”

	[C'1]
		from
			“Initialize data structures”
		until
			“All function definitions processed”
		loop
			“Read in next function definition”
			“Generate partial code”
		end
		“Fill in cross references”

As in the preceding example we start with two completely different architectures.
Each is defined by a control structure (a sequence of instructions in the first case, a loop followed by an instruction in the second), implying strict ordering constraints between the elements of the structure. But freezing such ordering relations at the earliest stages of design is not reasonable. Issues such as the number of passes in a compiler and the sequencing of various activities (lexical analysis, parsing, semantic processing, optimization) have many possible solutions, which the designers must devise by considering space-time tradeoffs and other criteria which they do not necessarily master. (Chapter 32 discusses techniques and tools for user interfaces.)
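The point about premature ordering can be sketched concretely. In the toy version below (all names hypothetical, with deliberately trivial phase bodies), the four phases of [C1] are written once; one driver runs them as a strict sequence over the whole program, while a driver in the style of [C'1] loops over definitions and generates partial code each time. Both orderings produce the same code, which is exactly why neither deserves to be frozen into the top of the architecture:

```python
# Hypothetical stand-ins for the compiler phases; deliberately trivial.

def tokenize(source):
    # "Read program and produce sequence of tokens"
    return source.split()

def parse(tokens):
    # "Parse sequence of tokens into abstract syntax tree"
    return [("def", t) for t in tokens]

def decorate(tree):
    # "Decorate tree with semantic information"
    return [(kind, name, len(name)) for kind, name in tree]

def generate(tree):
    # "Generate code from decorated tree"
    return [f"CODE {name}:{size}" for _, name, size in tree]

def compile_c1(source):
    # [C1]: one strict phase-by-phase sequence over the whole program.
    return generate(decorate(parse(tokenize(source))))

def compile_c1_loop(source):
    # [C'1] style: loop over definitions, generating partial code each time.
    code = []
    for token in tokenize(source):
        code += generate(decorate(parse([token])))
    return code
```

The number of passes and the interleaving of phases become an internal decision of the drivers, changeable without touching the phases themselves.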