PE report (section 3b)
TOOLS
(T1) Fast turnaround for minor program changes (<5 sec)
Our concern with fast turnaround comes from the observation that programming should be "think bound", not "compute bound". There are several "knees" (points of substantial non-linearity) in one’s perception of response delays. One such knee is in the vicinity of 3-5 seconds. We believe that it is essential to reduce the system time for minor program changes to below this point.
(T2) Compiler/interpreter available with low overhead at run time
The issues here are similar to those in (L9-10), namely, to reduce the mental and execution "gear-shifting" overhead caused by artificial divisions between compilation and execution environments. A sufficiently fast compiler is just as good as an interpreter for executing typed-in programs or programs constructed on-the-fly by other programs. However, it is essential that one be able to save some form of compiled code, for applications that embed procedures in data structures.
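By way of illustration (in Python, a modern notation, not one of the three systems under discussion; the names are ours), the following sketch shows both halves of the point: compiling typed-in text is as immediate as interpreting it, and the compiled code can be saved and later reloaded without the source.

    import marshal

    # Compile a typed-in program fragment on the fly, as an interpreter would.
    source = "def greet(name):\n    return 'hello, ' + name\n"
    code = compile(source, "<typed-in>", "exec")

    # Save the compiled form, so a procedure can be embedded in a data
    # structure or a file and reloaded without recompiling the source.
    saved = marshal.dumps(code)

    # Later, perhaps in another session: restore and run the compiled code.
    env = {}
    exec(marshal.loads(saved), env)
    print(env["greet"]("world"))        # -> hello, world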
(T3) Cross-reference/annotation capability
(T4) Prettyprinter
These capabilities contribute substantially to the readability of programs, which in turn has a large effect on the ease of maintenance. A simple "batch" cross-reference facility is essential; more sophisticated facilities, such as those of Masterscope, are less urgent.
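To make "batch" concrete, here is a minimal cross-referencer sketched in Python (a modern notation; the function name is ours). It records, for each name, the lines on which it is bound and on which it is used -- the raw material for both cross-reference listings and annotation.

    import ast
    from collections import defaultdict

    def cross_reference(source: str) -> dict:
        """Map each name to the lines where it is defined or used."""
        refs = defaultdict(list)
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.Name):
                kind = "def" if isinstance(node.ctx, ast.Store) else "use"
                refs[node.id].append((node.lineno, kind))
            elif isinstance(node, ast.arg):
                refs[node.arg].append((node.lineno, "def"))
            elif isinstance(node, (ast.FunctionDef, ast.ClassDef)):
                refs[node.name].append((node.lineno, "def"))
        return dict(refs)

    program = "x = 1\ndef f(y):\n    return x + y\nz = f(x)\n"
    for name, places in sorted(cross_reference(program).items()):
        print(name, places)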
(T5) Consistent compilation
(T6) Version control
Consistent compilation is an efficiency issue: to get the right thing to happen without blindly recompiling and reloading everything. Version control is more fundamental. It has two major aspects: history and parameterization.
Under history, we want to be able to tell exactly how a particular system or component was constructed, and what it depends on (e.g. microcode version, interface definition, or whatever). Furthermore, we want to be able to reconstruct a component automatically. This requires that every computation involved in its original construction record all its inputs and be prepared to repeat itself from this record. Since the inputs may be (references to) files, it is also necessary to have a naming scheme for files that is unique over the whole universe, and a guarantee that no file will ever be destroyed (unless the rule for reconstructing it is saved, together with all the required inputs).
Under parameterization, we want a systematic way of specifying the construction of a system that never existed before (e.g. it is for a new microcode version, or different implementations of the same interfaces are combined in a new way). We agreed that we don’t aspire to solve this problem in the full generality required by IBM.
Replacing code in an existing system is in principle a special case of consistent compilation -- the general question is when a complete but expensive procedure (recompilation, reloading, etc.) can be bypassed. We note that in practice replacing code is not just an efficiency issue, since getting the system back to its exact current state is not possible in general. The reason is that the current state depends on user program execution, and the user program cannot be counted on to follow the rules mentioned under history, which we impose on the system programs that make up the environment.
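The history discipline can be sketched briefly (Python, our names; a sketch of the idea, not a design). Each construction step writes a manifest naming its inputs by content hash -- a name unique over the whole universe -- so the step can later be verified or repeated; the cheap half of consistent compilation then reduces to asking whether any recorded fingerprint has changed.

    import hashlib
    import json
    import os

    def fingerprint(path: str) -> str:
        """A content hash serves as a universe-wide unique name for
        this particular version of a file."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def build(inputs: list, output: str, construct) -> None:
        """Run the construction step and record everything needed to
        repeat it: the rule is `construct`, the inputs named by hash."""
        manifest = {p: fingerprint(p) for p in inputs}
        construct(inputs, output)
        with open(output + ".manifest", "w") as f:
            json.dump(manifest, f)

    def up_to_date(output: str) -> bool:
        """Rebuild only if some recorded input has changed since the
        manifest was written."""
        try:
            with open(output + ".manifest") as f:
                manifest = json.load(f)
        except FileNotFoundError:
            return False
        return all(os.path.exists(p) and fingerprint(p) == h
                   for p, h in manifest.items())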
(T7) Librarian, program-oriented filing system (including Browser)
Coordinating access by multiple maintainers to a program made up of many packages requires some automation to avoid loss of consistency or even valuable information. DeSoto takes care of updating a "Bible" version from individual versions, and vice versa; the Mesa Librarian allows programmers to "check out" (in the sense of a library book) a module for modification. A system richly endowed with packages also needs some automation to catalog them and their documentation in a way that actively aids users in finding what they need.
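The check-out discipline itself is small; a sketch (Python, names ours) of the book-keeping a Librarian must guarantee -- one maintainer at a time per module:

    class Librarian:
        """Check-out registry: one maintainer at a time per module."""
        def __init__(self):
            self._holder = {}               # module name -> maintainer

        def check_out(self, module: str, who: str) -> None:
            holder = self._holder.get(module)
            if holder is not None and holder != who:
                raise RuntimeError(f"{module} is checked out by {holder}")
            self._holder[module] = who

        def check_in(self, module: str, who: str) -> None:
            if self._holder.get(module) != who:
                raise RuntimeError(f"{who} does not hold {module}")
            del self._holder[module]        # anyone may check it out again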
(T8) Source-language debugger
It is essential that the programmer be able to debug using the same language constructs and concepts used in writing the original program. This is facilitated by a minimum of distinctions between "compile-time" and "run-time" environments -- see also (L10) and (T2).
(T9) Dynamic measurement facilities
These facilities are necessary to understand the behavior of complex programs under conditions of actual use. Smalltalk and Mesa have a Spy, which works by sampling the PC, and Lisp has Breakdown. The Smalltalk and Mesa facilities are relatively little used: the Mesa facilities are poorly documented and supported, and the nature of Smalltalk is such that there is often little meaningful tuning one can do. Proposed for Mesa by SDD are:
- the Diamond Test Module’s facility for counting frequency and time between any pair of breakpoints,
- a facility for writing an event in a log, either by procedure calls, or by action to be taken at a breakpoint, and
- a "transfer trap" mechanism that logs data at every control transfer, together with some standard ways of reducing this data to produce the same kind of information that a Spy produces.
We agree that something as good as Breakdown is good enough.
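The PC-sampling idea behind the Spy and Breakdown fits in a few lines. The sketch below (Python, a modern notation; not any of the actual implementations, and the names are ours) charges each periodic sample to the function the measured thread happens to be executing:

    import collections
    import sys
    import threading
    import time

    def spy(target, interval=0.001):
        """Run `target` and periodically sample where its thread is
        executing; report how often each function was caught running."""
        samples = collections.Counter()
        thread = threading.Thread(target=target)
        thread.start()
        while thread.is_alive():
            frame = sys._current_frames().get(thread.ident)
            if frame is not None:
                samples[frame.f_code.co_name] += 1   # the sampled "PC"
            time.sleep(interval)
        thread.join()
        return samples

    def busy():
        sum(i * i for i in range(2_000_000))

    print(spy(busy).most_common())

Reducing the samples to fractions of total execution time is then just a matter of dividing by the total sample count.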
(T10) Checkpoint, establishing a protected environment
Checkpointing is needed to protect oneself against unexpected machine or system failures. The weakness of the facilities in all three current systems was noted: file state is not saved, nor is there any check on restart that it has not changed. "Protected environment" means the ability to install a new version of something in a system and still be able to revert to the old version very cheaply; cheap checkpoints, which can be implemented in a paged system with copy-on-write techniques, provide this.
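A sketch of the copy-on-write trick (Python on Unix, using fork; the names are ours, and this is not how any of the three systems works): the checkpoint is a cheap copy of the address space, the experiment runs in the copy, and reverting means abandoning it. Note that, exactly as observed above, file state is not protected.

    import os

    def protected(experiment) -> bool:
        """Run `experiment` against a copy-on-write checkpoint: the
        child gets a cheap copy of the address space, and reverting is
        just abandoning it.  File state is NOT protected."""
        pid = os.fork()                 # cheap: pages shared until written
        if pid == 0:                    # child: the protected environment
            try:
                experiment()
                os._exit(0)
            except Exception:
                os._exit(1)
        _, status = os.waitpid(pid, 0)  # parent: memory state untouched
        return os.WEXITSTATUS(status) == 0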
(T11) History and undoing
"History" refers to the system keeping a typescript file, the ability to feed information from the typescript back to the system, and the ability to have a handle on the values returned as well. This is an attribute of the interactive interface: its value comes from the observation that the operations one performs often are similar to, or use the results of, the operations one has recently performed.
An "undoing" mechanism should cover both system-implemented actions such as edits, and a way for users to supply a procedure that will undo the effects of another specified procedure. This can be a very inexpensive alternative to checkpointing as a way to give the user the ability to experiment with alternatives without imposing the burden of manually saving and restoring the relevant state.
(T12) Editor integrated with language system
Editing is just one function of a language system, carried out using a particular sublanguage. (In fact, we believe it may be the appropriate model for the major sublanguage presented to the user at the terminal.) As such, it should be integrated with the rest of the language system in that:
- the user doing editing can call on arbitrary programs to compute commands or data needed for the editing process, including the ability to pass selections from the thing being edited to the computation as arguments;
- any program can call on the editor as a package.
The latter seems very useful and relatively easy to achieve. We agree the former is also valuable, but there is disagreement over whether it is merely valuable or extremely important.
(T13) More optimizing compiler if user willing to bind more tightly -- with full compatibility
A recurring source of difficulty in programming is the generality/efficiency tradeoff: when it is time to tune a system for better performance, it is traditionally necessary to make major logic changes as well. An alternative is to keep the logical structure of the program the same, but have a compiler that can do the necessary rearrangements as part of the compilation process: the programmer instructs it as to what kinds of flexibility or modularity should be sacrificed, and the critical choices to be made in the representation of data structures. The Lisp "block compiler" is one example of this idea; the in-line procedure feature suggested for Mesa is another. We did not discuss this area except in passing.
(T14) Aids for incremental development (stubs, outstanding task list)
Top-down programming, or independent development of modules in a system, often benefits from the ability to replace as yet unavailable modules with "stubs" which have the same functional behavior but simpler (and presumably less efficient) implementation. This and other aids for keeping track of the status of parts of a large project have been used successfully in many system development efforts.
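One attraction is that the outstanding task list can fall out of the stubs themselves. A sketch (Python, our names): each stub registers itself when defined, so the list of work remaining is always just the list of stubs still in the system.

    OUTSTANDING = []                    # the outstanding-task list

    def stub(note: str):
        """Mark a stand-in implementation; registering it keeps the
        task list up to date automatically."""
        def wrap(fn):
            OUTSTANDING.append((fn.__name__, note))
            return fn
        return wrap

    @stub("real spelling correction still to be written")
    def correct(word: str) -> str:
        return word                     # same interface, trivial behavior

    def report_outstanding():
        for name, note in OUTSTANDING:
            print(f"STUB {name}: {note}")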
(T15) Regression testing system
(T16) Random testing aids
Both regression testing -- keeping a record of standard tests and results with a program module, and automatically checking them after a change to the module -- and testing with random data have proven to be worthwhile methods for checking programs too large or complex to verify or describe analytically.
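Both methods fit in a short sketch (Python, our names; the JSON record format is invented for illustration): regression testing replays recorded cases against recorded answers, and random testing generates fresh inputs and checks a property that must always hold.

    import json
    import random

    def regression_test(fn, record_path: str) -> bool:
        """Replay recorded (arguments, expected result) pairs after a
        change to `fn`, e.g. [[[1, 2], 3], ...] in a JSON file."""
        with open(record_path) as f:
            cases = json.load(f)
        return all(fn(*args) == expected for args, expected in cases)

    def random_test(sort_fn, trials: int = 1000) -> bool:
        """Random data: check a property that must hold for any input."""
        for _ in range(trials):
            xs = [random.randint(-100, 100)
                  for _ in range(random.randint(0, 20))]
            if sort_fn(xs) != sorted(xs):   # property: must sort correctly
                print("counterexample:", xs)
                return False
        return True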
(T17) (high capability) Masterscope
Any facility like Masterscope, which maintains an up-to-date data base of relations among parts of a program, must be integrated into the system at a fundamental level. Our discussion revealed that the fundamental aspect is the need for a single funnel for changes to the system (Mesa pretty much has this now, but Lisp does not). Relation to the file system was discussed, and it was agreed that manual use of the file system should be outlawed. A consequence is that the PE must do recovery at least as well as the file system does. Of course, having a reliable file system underneath makes this much easier. A variety of techniques are possible, which we did not explore in detail.
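What the single funnel buys can be shown in miniature (Python, our names; Masterscope itself maintains far richer relations): every definition enters the system through one procedure, which updates the who-calls-whom data base as a side effect, so the data base cannot silently fall out of date.

    import ast

    CALLS = {}      # the data base: definer -> names it calls
    ENV = {}        # the running system's definitions

    def define(source: str) -> None:
        """The single funnel: every (re)definition passes through here,
        so the relation data base is updated on every change."""
        tree = ast.parse(source)
        for node in tree.body:
            if isinstance(node, ast.FunctionDef):
                CALLS[node.name] = sorted(
                    {n.func.id for n in ast.walk(node)
                     if isinstance(n, ast.Call)
                     and isinstance(n.func, ast.Name)})
        exec(compile(tree, "<definition>", "exec"), ENV)

    define("def double(x):\n    return x + x\n")
    define("def quadruple(x):\n    return double(double(x))\n")
    print(CALLS["quadruple"])           # ['double']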
(T18) Access to on-line documentation (Helpsys)
Good on-line documentation, both for reference and for learning, can greatly reduce the time spent studying an enormous manual, can provide instant cross-linking of related subjects in a way that hardcopy cannot, and can use one’s current context to locate relevant material implicitly -- Lisp’s Helpsys facility is unique in these respects. However, creating and maintaining such documentation is a tremendous amount of work, even if the process is partly automated.
(T19) Static analyzers: verifier, performance predictor
We agree that, especially for programs used by many people in low-tolerance environments, an ounce of prevention is worth a pound of cure: effort expended on eliminating bugs or bottlenecks beforehand can save a lot of time and trouble locating them afterward. Unfortunately, verification technology is still unable to accommodate programs of significant size written in languages of realistic complexity, and very little has been done on deriving performance information from the program text (in contrast to analytic models of systems at a gross level, of which there are many).