(T2) Compiler/interpreter available with low overhead at run time
The issues here are similar to those in (L9-10), namely, to reduce the mental and execution ‘‘gear-shifting’’ overhead caused by artificial divisions between compilation and execution environments. A sufficiently fast compiler is just as good as an interpreter for executing typed-in programs or programs constructed on-the-fly by other programs. However, it is essential that one be able to save some form of compiled code, for applications that embed procedures in data structures.
(T3) Cross-reference/annotation capability
(T4) Prettyprinter
These capabilities contribute substantially to the readability of programs, which in turn has a large effect on the ease of maintenance. A simple ‘‘batch’’ cross-reference facility is essential; more sophisticated facilities, such as those of Masterscope, are less urgent.
(T5) Consistent compilation
(T6) Version control
Consistent compilation is an efficiency issue: to get the right thing to happen without blindly recompiling and reloading everything. Version control is more fundamental. It has two major aspects: history and parameterization.
Under history, we want to be able to tell exactly how a particular system or component was constructed, and what it depends on (e.g., microcode version, interface definition, or whatever). Furthermore, we want to be able to reconstruct a component automatically. This requires that every computation involved in its original construction record all its inputs and be prepared to repeat itself from this record. Since the inputs may be (references to) files, it is also necessary to have a naming scheme for files that is unique over the whole universe, and a guarantee that no file will ever be destroyed (unless the rule for reconstructing it is saved, together with all the required inputs).
Under parameterization, we want a systematic way of specifying the construction of a system that never existed before (e.g., it is for a new microcode version, or different implementations of the same interfaces are combined in a new way). We agreed that we don’t aspire to solve this problem in the full generality required by IBM.
Replacing code in an existing system is in principle a special case of consistent compilation— the general question is when a complete but expensive procedure (recompilation, reloading, etc.) can be bypassed. We note that replacing code is in practice not just an efficiency issue, since getting the system back to the exact current state is not possible in general. The reason is that the current state depends on user program execution, and the user program cannot be counted on to follow the rules mentioned under history, which we impose on the system programs that make up the environment.
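To make the history requirement concrete, the following Python fragment sketches one possible form for such a construction record; the names (FileRef, BuildRecord, rebuild) and the exact fields are illustrative assumptions, not a description of any existing facility.

    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class FileRef:
        name: str          # universally unique file name
        version: int       # immutable version; old versions are never destroyed

    @dataclass
    class BuildRecord:
        component: str
        inputs: list = field(default_factory=list)   # every FileRef the construction read
        tool: FileRef = None                         # e.g., compiler or microcode version used
        command: str = ""                            # how to repeat the construction

    def rebuild(record, fetch, run):
        # Reconstruct the component automatically by repeating the original
        # computation from its recorded inputs. `fetch` maps a FileRef to its
        # (immutable) contents; `run` invokes the recorded tool on them.
        sources = {ref.name: fetch(ref) for ref in record.inputs}
        return run(fetch(record.tool), record.command, sources)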
(T7) Librarian, program-oriented filing system (including Browser)
Coordinating access by multiple maintainers to a program made up of many packages requires some automation to avoid loss of consistency or even valuable information. For example, a librarian allows programmers to ‘‘check out’’ (in the sense of a library book) a module for modification; other tools can assist in re-integrating versions of modules that have been modified separately. A system richly endowed with packages also needs some automation to catalog them and their documentation in a way that actively aids users in finding what they need.
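As a rough sketch of the check-out discipline (the class and method names below are hypothetical):

    class Librarian:
        def __init__(self):
            self.checked_out = {}   # module name -> programmer currently holding it

        def check_out(self, module, programmer):
            # Like checking out a library book: only one programmer at a time.
            holder = self.checked_out.get(module)
            if holder is not None:
                raise RuntimeError(f"{module} is already checked out by {holder}")
            self.checked_out[module] = programmer

        def check_in(self, module, programmer):
            if self.checked_out.get(module) != programmer:
                raise RuntimeError(f"{module} was not checked out by {programmer}")
            del self.checked_out[module]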
(T8) Source-language debugger
It is essential that the programmer be able to debug using the same language constructs and concepts used in writing the original program. This is facilitated by a minimum of distinctions between compile-time and run-time environments—see also (L10) and (T2).
(T9) Dynamic measurement facilities
These facilities are necessary to understand the behavior of complex programs under conditions of actual use. Smalltalk and Mesa have a ‘‘Spy,’’ which works by sampling the program counter, and Lisp has Breakdown. The Smalltalk and Mesa facilities are relatively little used: the Mesa facilities are poorly documented and supported, and the nature of Smalltalk is such that there is often little meaningful tuning one can do. Available for Mesa programs are
a facility for counting frequency and time between any pair of breakpoints;
a facility for writing an event in a log, either by procedure calls, or by action to be taken at a breakpoint; and
a ‘‘transfer trap’’ mechanism that logs data at every control transfer, together with some standard ways of reducing this data to produce the same kind of information that a Spy produces.
We agree that something as good as Breakdown is good enough.
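The sampling idea behind the Spy can be suggested by a small Python sketch that periodically records which procedure is executing; the use of Unix profiling timers and the form of the sample store are assumptions for illustration only.

    import collections, signal

    samples = collections.Counter()

    def _sample(signum, frame):
        # Record the procedure that was executing when the timer fired.
        if frame is not None:
            samples[(frame.f_code.co_filename, frame.f_code.co_name)] += 1

    def start_sampling(interval=0.01):
        signal.signal(signal.SIGPROF, _sample)
        signal.setitimer(signal.ITIMER_PROF, interval, interval)

    def stop_sampling():
        signal.setitimer(signal.ITIMER_PROF, 0, 0)
        return samples.most_common()    # procedures ranked by observed samples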
(T10) Checkpoint, establishing a protected environment
Checkpointing is needed to protect oneself against unexpected machine or system failures. The weakness of the facilities in all three current systems was noted: file state is not saved, nor is there any check that it hasn’t changed on a restart. A protected environment means the ability to install a new version of something in a system and still be able to revert to the old version very cheaply; cheap checkpoints can provide this, and they can be implemented in a paged system with copy-on-write techniques.
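A minimal sketch of the protected-environment idea, keeping the old version of a component so that reverting is cheap (the registry below is purely illustrative):

    class Installer:
        def __init__(self):
            self.current = {}    # component name -> version now in use
            self.previous = {}   # component name -> version to revert to

        def install(self, name, new_version):
            self.previous[name] = self.current.get(name)
            self.current[name] = new_version

        def revert(self, name):
            old = self.previous.get(name)
            if old is None:
                raise RuntimeError(f"no saved version of {name} to revert to")
            self.current[name] = old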
(T11) History and undoing
History refers to the system keeping a typescript file, the ability to feed information from the typescript back to the system, and the ability to have a handle on the values returned as well. This is an attribute of the interactive interface: its value comes from the observation that the operations one performs often are similar to, or use the results of, the operations one has recently performed.
An undoing mechanism should cover both system-implemented actions such as edits, and a way for users to supply a procedure that will undo the effects of another specified procedure. This can be a very inexpensive alternative to checkpointing as a way to give the user the ability to experiment with alternatives without imposing the burden of manually saving and restoring the relevant state.
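One way to picture such a mechanism is a history list in which each recorded action carries a user-supplied procedure that undoes it; the sketch below is an assumption about the interface, not a description of any existing system.

    class History:
        def __init__(self):
            self.events = []   # (description, result, undo procedure)

        def record(self, description, result, undo=None):
            # Log an action and keep a handle on its value; `undo`, if supplied
            # by the user, reverses the action's effects.
            self.events.append((description, result, undo))
            return result

        def undo_last(self):
            description, _, undo = self.events.pop()
            if undo is None:
                raise RuntimeError(f"no way to undo: {description}")
            undo()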
(T12) Editor integrated with language system
Editing is just one function of a language system, carried out using a particular sublanguage. (In fact, we believe it may be the appropriate model for the major sublanguage presented to the user at the terminal.) As such, it should be integrated with the rest of the language system in that
the user doing editing can call on arbitrary programs to compute commands or data needed for the editing process, including the ability to pass selections from the thing being edited to the computation as arguments;
any program can call on the editor as a package.
The latter seems very useful and relatively easy to achieve. We agree the former is also valuable, but there is disagreement over whether it is merely valuable or extremely important.
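A small Python sketch may make the two forms of integration concrete; Editor and its methods are hypothetical names used only for illustration.

    class Editor:
        def __init__(self, text=""):
            self.text = text
            self.selection = (0, 0)     # (start, end) of the current selection

        def replace_selection(self, new_text):
            start, end = self.selection
            self.text = self.text[:start] + new_text + self.text[end:]

        def apply(self, command):
            # The user doing editing calls on an arbitrary program to compute
            # the edit, passing the current selection as an argument.
            start, end = self.selection
            self.replace_selection(command(self.text[start:end]))

    ed = Editor("foo bar")              # any program can call on the editor as a package
    ed.selection = (0, 3)
    ed.apply(str.upper)                 # the selection "foo" becomes "FOO"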
(T13) More optimizing compiler if user willing to bind more tightly—with full compatibility
A recurring source of difficulty in programming is the generality/efficiency tradeoff: when it is time to tune a system for better performance, it is traditionally necessary to make major logic changes as well. An alternative is to keep the logical structure of the program the same, but have a compiler that can do the necessary rearrangements as part of the compilation process: the programmer instructs it as to what kinds of flexibility or modularity should be sacrificed, and the critical choices to be made in the representation of data structures. The Lisp ‘‘block compiler’’ is one example of this idea; the Mesa in-line procedure is another. We did not discuss this area except in passing.
(T14) Aids for incremental development (stubs, outstanding task list)
Top-down programming, or independent development of modules in a system, often benefits from the ability to replace as yet unavailable modules with stubs which have the same functional behavior but simpler (and presumably less efficient) implementation. This and other aids for keeping track of the status of parts of a large project have been used successfully in many system development efforts.
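For example, a stub might look like the following (purely illustrative):

    def sort_records(records):
        # Stub for a sorting module that does not exist yet: same functional
        # behavior, simpler and presumably less efficient implementation.
        return sorted(records)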
(T15) Regression testing system
(T16) Random testing aids
Both regression testing—keeping a record of standard tests and results with a program module, and automatically checking them after a change to the module—and testing with random data have proven to be worthwhile methods for checking programs too large or complex to verify or describe analytically.
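A regression test in this sense can be sketched very simply; the module and the recorded cases below are invented for illustration.

    recorded_tests = [
        # (input arguments, result recorded from a version known to be good)
        ((3, 4), 7),
        ((0, 0), 0),
        ((-1, 1), 0),
    ]

    def add(x, y):                      # the module under test
        return x + y

    def run_regression(function, tests):
        # Re-run the standard tests after a change and report any case whose
        # result no longer matches the recorded one.
        return [(args, expected, function(*args))
                for args, expected in tests
                if function(*args) != expected]

    assert run_regression(add, recorded_tests) == []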
(T17) (high capability) Masterscope
Any facility like Masterscope, which maintains an up-to-date data base of relations among parts of a program, must be integrated into the system at a fundamental level. Our discussion revealed that the fundamental aspect is the need for a single funnel for changes to the system (Mesa pretty much has this now, but Lisp does not). Relation to the file system was discussed, and it was agreed that manual use of the file system should be outlawed. A consequence is that the programming environment must do recovery at least as well as the file system does. Of course, having a reliable file system underneath makes this much easier. A variety of techniques are possible, which we did not explore in detail.
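The single-funnel point can be illustrated with a toy cross-reference data base in which every (re)definition passes through one entry point that keeps the caller relation current; all names here are assumptions made for the sketch.

    class ProgramDatabase:
        def __init__(self):
            self.definitions = {}        # name -> source text
            self.callers = {}            # name -> set of names that call it

        def define(self, name, source, calls=()):
            # The single funnel: every change to the program comes through here,
            # so the data base never falls out of date.
            for callee in self.callers:
                self.callers[callee].discard(name)   # drop stale relations
            self.definitions[name] = source
            for callee in calls:
                self.callers.setdefault(callee, set()).add(name)

        def who_calls(self, name):
            return sorted(self.callers.get(name, ()))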
(T18) Access to on-line documentation (Helpsys)
Good on-line documentation, both for reference and for learning, can greatly reduce the need for time spent studying an enormous manual, can provide instant cross-linking of related subjects in a way that hardcopy cannot, and can use one’s current context to implicitly locate relevant material—Lisp’s Helpsys facility is unique in these respects. However, creating and maintaining such documentation is a tremendous amount of work, even if the process is partly automated.
(T19) Static analyzers: verifier, performance predictor
We agree that, especially for programs used by many people in low-tolerance environments, an ounce of prevention is worth a pound of cure: effort expended on eliminating bugs or bottlenecks beforehand can save a lot of time and trouble locating them afterward. Unfortunately, verification technology is still unable to accommodate programs of significant size written in languages of realistic complexity, and very little has been done on deriving performance information from the program text (in contrast to analytic models of systems at a gross level, of which there are many).
2.4. Packages
Beyond the virtual machine and programming language, which are forced on all users, and the tools, which should be applicable to all users, the quality of a programming environment is largely determined by the presence of packages that provide functional capabilities useful to many applications. We cannot stress too strongly that, from our experience, the only way to ensure the necessary high quality for such packages is to have a very small group (one to four people) with the final authority and responsibility for deciding which packages are to be incorporated in the system in a way that makes them readily available to all.
(P1) Text objects and images
(P2) Line objects and images
(P3) Scanned (bitmap) objects and images
(P4) Formatted document files
The manipulation of images is of primary concern to us in our experimental systems. We can divide these manipulations into two categories:
Manipulation of abstract objects such as formatted documents, forms, line drawings, and continuous-tone images. The operations on these objects are defined by the semantics of the objects, not by their representation on a medium.
Manipulation of the images of these objects on displays or printers. These operations must take the nature of the medium into account.
In the first category we might find editing operations on document files such as insert, replace, search; on drawings and pictures such as scale, rotate, reflect, clip, shade, connect points with a spline curve. In the second we find operations for mapping objects onto media in a variety of ways, some of which must be reversible (e.g., when a user makes a selection in the displayed image of a document, that selection really refers to the data in the document itself).
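The reversibility requirement can be sketched by retaining the mapping used to lay characters out on the screen, so that a screen position can be traced back to a position in the document; the layout rule below is deliberately simple-minded and purely illustrative.

    def layout(text, line_width):
        # Forward mapping: character index -> (row, column) on the display.
        positions = {}
        row = col = 0
        for i, ch in enumerate(text):
            positions[i] = (row, col)
            if ch == "\n" or col == line_width - 1:
                row, col = row + 1, 0
            else:
                col += 1
        return positions

    def position_to_index(positions, row, col):
        # Reverse mapping: which character did the user select on the screen?
        for index, pos in positions.items():
            if pos == (row, col):
                return index
        return None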
We believe that enough experience has been gained in these areas that it is possible to construct packages that will be useful in a wide range of programs, and that will markedly decrease the effort required to write programs that use them.
An interesting area that we have not discussed per se is the general notion of annotation of documents (or data structures): formatting information can be considered an annotation to the text, comments to a program, meta-descriptions in the KRL sense to a slot or another description. Pilot, for example, provides a notion of subsequences of a byte stream, which can easily be used to represent formatting information in a way that uninterested programs can ignore.
(P5) More elaborate screen management
In addition to the basic screen management capabilities mentioned under (L18), there are some additional facilities (scrollable windows, for example) that many programs will want to share. Again, we believe there will be a payoff from the presence of some carefully designed packages in the environment.
(P6) Remote file storage
The manual transfer of files between machines is a significant source of errors and wasted time. Such transfers are necessary either because of space problems or because one machine has a capability (such as a printer or high-performance display) not possessed by all.
(P7) Small data base manager
As a goal, we believe that the well-integrated access to large data bases mentioned under (L3) has a potentially enormous payoff, since many tools as well as experimental programs will benefit from it. However, if it turns out that we can’t figure out how to provide this, then we will need a well-designed data base package for managing locally stored data.
(P8) Message transmission system
Message transmission is a useful paradigm for many kinds of inter-machine communication.
(P9) Remote procedure call
The ability to call a procedure on another machine as though it were on one’s local machine is a different, less well understood communication paradigm.
(P10) Event logging
Event logging is a useful technique for redundancy and crash protection, for gathering statistics, and for reducing the cost of updating a data base in response to events affecting it.
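As a sketch of the technique (the file name and record format are assumptions): events are appended to a durable log, and the data base is brought up to date later, or after a crash, by replaying the log.

    import json

    LOG_FILE = "events.log"              # hypothetical log file

    def log_event(kind, data):
        with open(LOG_FILE, "a") as log:
            log.write(json.dumps({"kind": kind, "data": data}) + "\n")
            log.flush()                  # make the event durable before going on

    def replay(apply_event):
        # Bring the data base up to date by replaying the logged events.
        try:
            with open(LOG_FILE) as log:
                for line in log:
                    event = json.loads(line)
                    apply_event(event["kind"], event["data"])
        except FileNotFoundError:
            pass                         # nothing has been logged yet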
(P11) Background processing
In an interactive system with enough real memory, both external communication (sending and receiving mail, printing) and computation (recompilation, Masterscope data base maintenance) can make effective use of time when the user is thinking.
(P12) Generalized cache
Many applications can benefit from a cache mechanism that provides local copies of more remote data, e.g., copies in memory of data from the disk, or copies on a local machine of files stored remotely. A package could keep track of which items had been used least recently, schedule rewriting of changed items, and deal with locks and timeouts.
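A sketch of such a package, with least-recently-used replacement and a record of changed items awaiting rewriting (locks and timeouts omitted; all names are illustrative):

    from collections import OrderedDict

    class Cache:
        def __init__(self, fetch, write_back, capacity=128):
            self.fetch = fetch               # obtains an item from the remote store
            self.write_back = write_back     # rewrites a changed item
            self.capacity = capacity
            self.items = OrderedDict()       # key -> local copy, in LRU order
            self.dirty = set()               # changed items not yet rewritten

        def get(self, key):
            if key not in self.items:
                self._make_room()
                self.items[key] = self.fetch(key)
            self.items.move_to_end(key)      # mark as most recently used
            return self.items[key]

        def put(self, key, value):
            if key not in self.items:
                self._make_room()
            self.items[key] = value
            self.items.move_to_end(key)
            self.dirty.add(key)

        def _make_room(self):
            while len(self.items) >= self.capacity:
                key, value = self.items.popitem(last=False)   # least recently used
                if key in self.dirty:
                    self.write_back(key, value)
                    self.dirty.discard(key)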
(P13) Document editing
Underlying document editing and manipulation facilities have been re-implemented time after time, because insufficient thought was given to organizing them as a general-purpose package. There is no good technical reason for this.
(P14) Forms
(P15) Menus and other standard user interfaces
Packages that provide standard user interface tools such as forms, menus, selection, etc. are desirable both in the interests of uniformity and simply to save work.
(P16) History lists
Programs should be able to take advantage of the same mechanisms used by the system to provide the history and undoing capabilities discussed in (T11).
(P17) User access to full bandwidth of disk
Data base manipulations and code overlaying require brief bursts of high-bandwidth disk activity. The system should not prevent the programmer from using the disk’s full bandwidth, and a package should make it easy.
(P18) (English) dictionary service
Office applications involving documents can benefit from easy access to an English dictionary (for spelling correction, hyphenation, and thesaurus applications, for example).
(P19) Teleconferencing
Inter-person communication should play more of a role in our future experiments; we need a package to handle the mechanics of keeping several users’ views of the screen, cursor, etc., consistent.
(P20) Audio
We have hardware support for capturing and playing back audio information, but hardly any software support. Something like the current audio message system ought to be a very small project.
(P21) User access to full bandwidth of networks
As in (P17), the system should not obstruct the programmer’s access to the machine’s full I/O bandwidth in experimental situations.
2.5. Other
(X1) Adequate reference documentation
Reference documentation must be complete and reasonably well organized and indexed. The Interlisp manual is a shining example of how well an entire environment can be documented. It also demonstrates that keeping this documentation up to date is a lot of work.
(X2) ‘‘Efficient’’ interface for experts
For experts, the desire for common operations to require a minimum of human effort often rightly takes precedence over the desire for the greatest possible uniformity or simplicity in the human interface. However, such interfaces are too often constructed without paying any attention to the few principles of interface design that we do know. We believe it is important to consider consciously the design of certain key command interfaces (editing, debugging, screen management).
(X3) Uniformity in command interface
In the process of using interactive programs, such as the tools listed in the previous major section, the user will inevitably accumulate perceptions, opinions, and models of the programs, and conjectures as to their workings. The net effect of these perceptions is referred to as the user illusion. The intent is to allow the user to see the programs only in relation to his own needs and purposes, and not have to concern himself with the internal workings of the programs. What is important about a standard user interface package is that the user be able to confidently predict the general manner of interaction with a program that uses the package, even though he hasn’t experienced it yet; and that, by and large, the user will be right. This has been called the Law of Least Astonishment.
The concept of a consistent user interface also simplifies the design and coding of any program that interacts with the user. By adopting the conventions and making use of the facilities, it is a relatively simple matter to create useful interactive programs, because the programmer can concern himself with the algorithm rather than with creating his own user interface.
We believe that, in addition to consistency, another important goal to pursue in the design of the user interface might be called the Principle of Non-Preemption. Individual interactive programs should operate in a non-intrusive manner with respect to the user’s activities. The system does not usurp the attention and prerogatives of the user. A program responds to the user’s stimuli, but then quietly retains its context and logical state until the user elects to interact with the program again, not (for example) monopolizing the resources of the computer.
(X4) ‘‘Self-teaching’’ interface for beginners
(X5) Good introductory documentation
Since we are concerned with a programming environment primarily for CSL, and secondarily for the rest of the local research community, we feel that concern for novices should have low priority, since the rate at which new people join the community is low and most of them are already sophisticated.