|The Gift From The Machine|
[Annotations added, July 1998, in maroon color, in square brackets, like this. Also, I wish to thank Tom Gee for having converted the original version of this essay, which resided on an IBM S/370 MVS mainframe computer system, to an ASCII file on a PC floppy disk, by running a printout through an optical character reader, and then manually "keying" some of the text. I wrote this essay almost entirely before I "returned to school" in September, 1983; consequently, it reflects almost nothing of my over 10 years of graduate studies.
(The size of this web page is 176k.)]
|**||TWO WAYS TO LOOK AT THE MATTER|
|**||TWO WAYS TO DO THE JOB|
|**||THE PROCESS AND THE PRODUCT|
|**||SCIENCE AND PROGRAMMERS|
|**||WHAT IS AT STAKE|
|**||APPENDIX: BUILDING JERUSALEM (THE SOCIAL ORGANIZATION OF PROGRAMMING)|
|**||APPENDIX: HOW THE SOCIAL SCIENCES CONTRIBUTE TO THE PROBLEM|
|**||MAN'S TASK OF KNOWING|
|**||ACKNOWLEDGEMENTS and EPILOG|
|Leisure is the basis of culture.|
"The way is everything. The end is nothing." ---Willa Cather
"Not yet and yet already." ---Hermann Broch
"Man speaks... The words speak." ---Martin Heidegger
"Let X=X. You know, it could be you." ---Laurie AndersonThis document sets forth a vision of the constructive potential of technology for people. It focuses on the relation between the computer programmer and his work, because computer programming work[, in parallel with ongoing self-directed reading,] taught me what I here share with you.
A familiar notion holds that the purpose of technological work is to produce useful products. A corollary is that the computer programmer's job is to write programs that satisfy user needs. This view is not literally incorrect in the simplistic sense that its contrary (to not produce products or to produce useless products) would be better. It is, however, inadequate, and unfortunate, because it cannot recognize the constructive potential, beyond products, of programming (technology in general), for the technological worker (programmer) and for society. Acceptance of the customary view causes people to fail to see the potential and to miss opportunities for realizing it. It also lets endeavors to shape the social activity of programming into something that no longer has the potential masquerade as progress and gain wide acceptance.
My view is that computer programming is an adventure an individual undertakes into understanding, self-understanding, and remaking the factual into the desirable. While few articulate such a definition as a description of their motivation, an access of light does, at least at first and implicitly, inspire many to become programmers, if only in the form of a naive fascination that: "This is like solving puzzles, only for real." People become programmers not just for a paycheck, but because the work promises personal growth ("solving puzzles") and public accomplishment ("for real"). That is a good beginning. But what a programmer ends up doing on the job often has little to do with the reasons he became one. Over time, many 'succeed' in putting it out of their minds and ceasing to expect too much. The promise of the good beginning gets lost, because its significance is not appreciated. ---After all, a programmer's job is to write programs that meet user needs, not to have fun. Let's replace the word "fun" in that sentence by "insight", and say it again: After all, a programmer's job is to write programs that meet user needs, not to have insight. (What could be more 'fun' than a new insight?) The promise of the good beginning that gets lost is the promise of insight, thru which alone the networks of significances that constitute the individual programmer, society (the 'user'), and universal culture come into existence and endure.
I do not speak for all times and places. I speak for opportunities that languish among us. The fate of mankind becomes daily more dependent upon computers. Unless we regress to a "new Dark Age", there are going to be programmers for a long time. Operating systems and programming languages will be written by somebody. There is unrecognized potential in this work. And it is not only the pragmatic potential that more creative programmers will build computer systems that better serve human needs (e.g., the optimization of global food production and distribution). It is also something (I, at least, find) wonderful: That the programmers' experience of their work --- what they do and how they do it and what doing it does for them --- may give meaning to their lives, and that, thereby, they may become examples, so that the people whose needs the systems satisfy may be inspired by the programmers' lives to discover their own reasons to live. Many persons today have their needs satisfied, but remain hungry for a reason to live.
The progeny of Nietzsche's "last man" prosper, some as managers and programmers. Short-sighted and often opportunistic exploitation of technology devastates the earth and trivializes our lives. Nonetheless, in the computer programmer's daily work --- closer to him than the nearest fast-food stand ---, the unrecognized promise of every skill (the Greek "techne") as a way of access to insight abides. This is "the gift from the machine". For the individual, it is the potential for meaning in his life. For society, the potential for renaissance. For universal culture, the potential that universal culture become universal among humans. It depends on persons' awareness of it in order to come into being, to develop and endure. Only in our thinking about the promise and sharing our thoughts with one another can the promise be fulfilled.
Programming, like anything that occupies a person's time, is also part of a life. To appreciate what that means requires a change of perspective: Instead of looking at what people do from the outside, as the 'behavior' of 'individuals' and 'groups', we need to consider human existence as it is lived, 'from within'.
Something is missing from this description: the different cargo-handlers' experiences. A machine does not have experiences. If its mechanism operates entirely automatically and entirely faultlessly, no one need be aware that the process it performs takes place. When people handle cargo, however, there is at least one person who is aware of what happens: he who carries the box. THE HUMAN CARGO HANDLER EXPERIENCES HIS WORK. Both man and machine accomplish the same objective result, but only the man is aware of it. He has an 'inside'. My purpose here is not to enquire deeply into the lived experience of moving boxes around, but to show that any human occupation can be viewed in two mutually exclusive ways:
If you look at programming from the inside, if you open yourself to the programmer's hopes and fears and to the effects doing programming has on his life, if you share his experience, then what the programmer does may still affect your balance sheet, but now it will also affect you. You will feel joy in a programming insight --- finding a "bug", or contemplating an elegant algorithm ---, and not just profit from its results. You will suffer when you see bad code, not just because of what it may cost you, but because it's something ugly in your world, and because you realize writing it was probably a mind-dulling experience in some programmer's life [, and maintaining it continues to be a mind-dulling experience in some other programmers' lives].
Is the one way of looking at the matter right and the other wrong? I know which way I want to answer that question, but I won't. For it is not a matter of being right (either morally or factually), but of being (living) together. If I am a programmer, I know what the one person thinks of me and my work, and I know what the other person thinks. Beyond "being right" is what we feel (or don't feel) for one another.
The "inside" question in regard to programming (or any other human activity) is: What is it really? What can it be? What is its promise to us? How can it "turn us on"? Example: I can hold a football in my hands, run around with it, throw it, or kick it somewhere. But anybody who knows and cares about football will protest: "That's not football, not really. Football is what John Unitas did." ---I ask "What is programming?", not to find the "right" answer to a question, but to discover the promise of an activity, for myself and my (present and potential) friends.
Programming is --- is 'really' --- hard work, and it's more fun than anything. It is frustrating to the point of total disorientation, and it is a source of profound peace and joy. In other words, programming is one of the domains in which the human spirit can find a broad field for the free exercise of a wide range of its potentialities. Programming, then, is like physics or poetry or music or architecture. It is an art or a science or a craft, whatever word pleases one.
Is programming really an activity with the kind of creative breadth and depth poetry, for example, has? Clearly, what many programmers do much of the time is not. Equally clearly, almost everybody is 'literate' today, and most people write words every day, and that is not 'writing'. Writing --- the poet's art --- nonetheless occurs sometimes. As far as programming is concerned, sometimes, in working with a line of code, I have seen something. I have seen what the line of code meant in the program, and what 'seeing what the line of code meant in the program' meant as an event in and of Being. To describe such an event as "an experience of beauty, wonder and creative insight" risks trivializing it (because these words have been trivialized in current usage). To describe it as a direct personal encounter with the destiny of the world --- though not untrue --- risks sounding absurd (because our lives have been trivialized in current usage).
But 'programming' is more even than this. It is sometimes a separable activity (e.g., writing operating systems and compilers) distinguished from other activities like physics and poetry. Programming can also become part of those other activities, transforming them from whatever they used to be in ways such that one of the things their practitioners do henceforth is to program. The activities become 'programmable'.
The generation of algorithmic procedures is a human capability that, until the advent of computers, was stymied by lack of engines that could execute algorithms efficiently, and that could easily and quickly be re-instructed to execute modified or different algorithms. This deficiency constrained the direction of human intellectual activity in general: The ways man has been able to do things have been restricted to ones that did not require bulk algorithmic processing (albeit, sometimes this restriction has been loosened for some persons by their turning other persons into algorithmic processors). Stated the other way around: Ways we could do things before we had computers were only weakly informed by our capacity for generating algorithmic procedures, and that capacity, in its turn, being exercised under highly constricted conditions, itself developed slowly.
Programming, understood in this broader sense, gives everyone ("users") the power to use algorithmic procedures. Writing programs fascinates many, once they gain some familiarity with the computer. Programming encourages the search for generalized solutions to problems, and, in consequence, may even stimulate reflection on what one does. This does not mean that every doctor and writer will become an assembler language programmer. In many cases, the 'programs' these persons write are not in any 'programming language', but rather are inputs to an application. What matters is that the interface between user and computer, beyond enabling the user to do this and then do that (which is what he always was able to do), empowers him to define macro procedures that execute alternative actions based on condition tests, do things iteratively or recursively, etc. The results of this 'new way of thinking' will probably be revolutionary for each field into which programming integrates itself in this way, and all the joys and frustrations of programmers will be shared by programmer-doctors, programmer-writers, etc.
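The kind of user-level 'macro procedure' described above might be sketched, in present-day terms, roughly as follows. (This is a hypothetical illustration only; the 'application', the function name, and the sample data are all invented, not drawn from the essay. The point is that the user expresses alternative actions via a condition test, and iteration via a loop, in the terms his application provides, without writing in any conventional 'programming language' milieu such as assembler.)

```python
# A hypothetical sketch of a user-defined "macro procedure":
# a writer, working inside some imagined text-handling application,
# composes a procedure that iterates over his chapters and takes an
# alternative action (flagging) when a condition test succeeds,
# rather than checking each word by hand.

def spellcheck_macro(chapters, dictionary):
    """For each chapter, flag words not found in the dictionary."""
    flagged = {}
    for title, text in chapters.items():          # iteration
        unknown = [w for w in text.split()
                   if w.lower() not in dictionary]
        if unknown:                               # condition test
            flagged[title] = unknown              # alternative action
    return flagged

chapters = {"One": "the gift from the machine",
            "Two": "an advanture in grace"}
dictionary = {"the", "gift", "from", "machine",
              "an", "adventure", "in", "grace"}

print(spellcheck_macro(chapters, dictionary))
# prints {'Two': ['advanture']}
```

The writer who composes such a procedure is, in the broader sense meant here, programming: he has generalized a solution instead of repeating a manual task.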
We who write programs are building the world in which all of us, programmers and non-programmers, will live. What opportunities for creativity and joy can we build into it? Can we make our programs models of lucidity, so that, when persons read the code or use it, they see by immediately present exemplar that man --- and, by extension, they themselves --- can produce things that are good?
Nowhere is the human spirit so little constrained by the accidents of given conditions as in the construction of logical systems. If these are not paragons, not only of clarity and intelligence, but also of joy and hope, what hope is there for the other areas of life, where the degree of human freedom is necessarily less? I write programs to participate in the building of universal culture, which I call "the network of significances", and which others have called "the city of man" or "The City of God."
Beyond needs and their satisfaction, beyond 'economics', the human meaning of work, the reason to do something even when one does not 'have to', is generosity, the spontaneity of venturing forth to seek, and, one hopes, to greet and make friends with what is other than oneself. Our work can open us to receive the other, and, when he comes, what we have made can be an offering to him, the basis of hospitality. I write programs as a way to approach those to whom I would go, and as a way to welcome those who would come to me: What I see is what I have to give to you.
Programming does not have to realize this vision, and programmers and managers often don't even try. People can become programmers because it is safer than teaching in a public school and pays better. The programming workplace can be just another white-collar office. The work can be done in a perfunctory manner. The product may have no higher aim than to bring in revenue. People choose what programming will be. And some people choose what it will be for other people. An individual programmer generally has some choice how he relates to code: he can strive for insight, or merely apply over and over the current clichés of the craft. He usually has less control over the conditions of his work and what he works on. Managers, and co-workers, most often determine these things. Even with the best intentions to do right by code, a programmer can be worn down by the forced option of working in a bad environment on projects that ought not be done, or not working.
Creative activity, on the other hand, seeks what is so far unknown to us. Activity is creative when: (1) we are uncertain what to do, or (2) we do not know how to do it. Consider trying to solve a puzzle: If you resist the temptation to look up the answer, often there is nothing you can do to help yourself figure it out. The answer may be 'staring you in the face', but you 'can't see it'. If you do look up the answer, it may be a further puzzle to figure it out. The solution, when and if it comes, comes from 'out of the blue', when it wants to, and no effort of will can make it happen. Because 'something is missing' --- be it (part of) the answer, or (part of) the question ---, creative activity cannot be done to schedule or on demand, and no one can be held accountable for accomplishing it. (Inventions cannot be scheduled. [The preceding sentence derives from a conversation I overheard in IBM, between two business planners, ca. 1978-9. The one said to the other: "Fishkill is not coming in with the inventions on schedule." The two were not happy.])
If we wish to create sophisticated, 'state of the art' computer systems, on the other hand, we do not have this moral luxury. Developing a new operating system (e.g.) requires inventions. Before we start, we are unsure how to do much of it. What we think we know often produces unexpected consequences (which is an oblique way of saying we didn't know after all). And parts we do know some way of doing often could be done a much better but unimagined way. Even 'worse', we may not be entirely certain what the system should do. Our only option is whether to try to force the project into the mold of something it is not. We can treat it as if it were a non-creative endeavor. We can pretend we know what we are doing when we do not know, and construct elaborate PERT-chart Potemkin villages: plans that change and schedules that slip. Alternatively, we can face the fact that the project is creative activity. We can admit we are not sure that we can do it, much less when. Once we concede the outcome is beyond our control, we find ourselves constrained to treat the project, and the persons who work on it, with respect, and 'tender loving care', in the hope that may help them bring it in.
'Improved programming technologies' may improve program quality where programmers have a poorly developed sense of structure and style and write 'spaghetti' code. They can also generate Hawthorne-effect transients. Anything, even writing a few block-structured GOTO-less programs [or looking for staples in their office carpet, etc.], may genuinely benefit a programmer, if the experience encourages him to think about what he does. 'Improved programming technologies' would be OK as optional selections on a universal smorgasbord of programming techniques. In the real world, where programmers are employees, it's not that way. Managers forcibly impose 'improved programming technologies', to control costs, not to stimulate thought.
I would like to find a "structured program" that inspires wonder and joy in me. The 'structured programs' I have seen have been uninspiring. There is, however, another sense in which one can speak of structure in programming: not the logical structure of code, but the social structure of the programming process. This is the kind of 'structure' that seems to be the real concern of the 'improved programming technologies' movement: to cause programmers to produce code thru structured activities performed in a structured environment. My disappointment in the 'structured programs' was nothing compared to my shock at the social context in which I found them being produced. It sometimes reminded me of a religious cult. The 'born again' programmers are intolerant --- not only of GOTOs, but of any programmer who will not submit to their depersonalized (they call it "egoless") regimen of top-down design and walkthrus; their disciples know nothing else. I saw how 'improved programming technologies' can make each programmer his own 'eye watching over his shoulder'. Some people may lack an ego and/or get pleasure from ferreting out their neighbors' heterodoxies. For the rest, a prudent person purges the GOTOs and the 'ego' from his code before coworkers have a chance to find him out at the walkthru. The 'best' programs have no bugs and few new ideas ("...the best surprise is no surprise"). Their 'merit' is that they can be maintained by any new-programmer training class graduate. ---All this contributes little to the improvement of program structure, but it brings a lot of 'structure' into programmers' daily activities, which thenceforth can neither be science nor art. [(Of course, if the program wasn't worth writing in the first place, then it should at least be maintainable by any programmer trainee -- or, even better: programming manager --, so that persons' lives do not continue to be wasted any more than necessary in maintaining it.)]
Programming can produce miracles of loaves and fishes. Unlike material goods, a program, once written, is written for everybody and forever; it does not need an assembly line to re-produce millions of clones. Programs do not wear out. ("Program maintenance" is a misnomer. What is called that is really the creation of new programs that happen to have lots of lines of code in common with old programs, so that the development cycle can be shortcutted by cannibalizing code from the old for the new.) New programs --- and every program is new each time a person learns something from it --- are new ideas. Every time a person 'changes' a program, he has an opportunity to think, i.e., to make not just a different program, but a better one. (If there is a fruitful analogy between 'program maintenance' and anything in the automotive realm, the analog is the custom shop, not the service station or repair shop. A program that gets 'maintenance' done on it often does exactly what it was supposed to do, but somebody wants it to do something else.)
A programmer has been productive in a sort of negative way when his product reduces the amount of drudge work people do. He has been positively productive when the product gives people opportunities to do interesting, instructive, enriching, ennobling things. A programmer has been productive when his work gives people happy surprises ("miracles", in the etymological sense of good things to wonder at), when people, himself included, are caught short by the good things they find in the code, so that they stop and marvel: "How elegant that is! How direct! How powerful! Such a delight! How did anyone ever think of that?". A programmer has been truly productive when, by a process of contagion, his work does good things for an ever widening circle of people: Starting with the initial design and coding, for himself. Next, for other programmers who work on the code with or after him. Then, for the various users of the system. Finally, for people who may never even know the programmer did what he did, thru the way the new ideas and ennobled feelings persons get from contact with his code change how they deal with others in both their work and 'private' lives.
This "end product" mentality presumes we already know what to do, assigns all creative energy to scheming how to do it more efficiently, and ignores the unknown. It does not solicit advice from programmers; it requisitions their manpower. The role of the 'improved programming technologies' is to help mold an efficient labor force, which, if "thinking" is understood to mean transforming inputs into outputs by logical/mental processes, does nothing but "think", but, if "thinking" is understood as questioning and searching, does not think at all. The programmers get procedures and tools that increase the efficiency with which they do atomized, routinized tasks they already know how to do or can quickly learn, and that hinder them from doing (or even imagining) other things.
If the proponents of 'improved programming technologies' fail to realize their dream of programming by clerks, many persons will have been cheated along the way out of good things they didn't know they were missing, and the final debacle --- information blackouts in megalopolis --- may threaten millions of us with starvation. If their dream does come true, the losses may be even worse: Then programmers will have opportunities to experience more of the same thing they already experience every day --- what at the end of their work shift some of them in a varying mixture of despair and resentment call: "another day" ("over the hump", "TGIF", etc.) --- forever.
[Another way to put the issue here is: The worker is the consumer; the producer is the user. Both sides of each dichotomy are the same persons, just in different roles (different hours of their days: "work-time" versus "leisure time", etc...). If we rob Peter to pay Paul -- if we cater to the user's desire for a bargain by squeezing the producer (e.g., lowering wages and lengthening working hours, to lower prices), the logical result will be great bargains for persons who are too ground down, broken and even dead, to be able to benefit from anything. The challenge is to heal the "splitting": to make work intrinsically rewarding and nourishing as well as productive, and "non-work" intrinsically productive as well as refreshing. What is needed is functional optimization of the whole "package", not hypertrophic maximization of particular subassemblies without measuring them by their effects in the overall system and process.]
Eventually, the strains rupture, and history demonstrates the limitations of the current ideals, usually in the form of catastrophes that hurt a lot of people. Mankind (although not always the same men) has some new insights, locks onto these, goes a little further, and gets stuck again.... We have an opportunity to find out what our special catastrophe is. We can try to go back. (But, if we got to here from there, then going back may just start us back on the road to here.) Alternatively, we can try to do something different: to unstick ourselves, permanently, by cultivating insight as a good in and for itself, not just for what it may be 'good for'.
Unceasing vigilance is needed from every person, user and programmer alike, to question every thing, especially what he has been told he 'ought' to do, because (as the origin of the phrase "killing of the dream" suggests) programmers are not the only people in danger. Far more important than what people have learned is that they can learn. Where programming is thinking, and thinking is commitment to insight, programming is a hope. Then programmers become 'smoke detectors' or forest rangers of the cultural realm, consciences of their time and not just sub-assemblies of it. (Jacob Bronowski called this hope: "the democracy of the intellect".) A programmer can start by questioning the line of code in front of him, and his relation to it[, including his working conditions, etc.].
There is another way to work, where the programmer finds an inner stillness, and becomes 'centered' in himself and attuned to his situation. When he sees what to do, he pro-duces (etymologically: leads forth, brings to presence), with only a few lines of code, something that was previously unknown, and therefore could not have been accomplished by any magnitude of effort. Real productivity gains come when free individuals attain this relaxed awareness that produces amazing results 'effortlessly' ("Zen mind").
The programmer needs to make a personal commitment to code: to cause projects that deserve to be done to be undertaken, and to prevent from being undertaken projects that ought not be done. If he undertakes a project, a programmer needs to commit himself to have the work done right, and that means: better than could have been planned. If, as the work progresses, what at first looked good ceases to look good, then the programmer needs to commit himself to stopping it. How a programmer works, whether he uses flowcharts or pseudocode or designs by coding (which may look as if he was not designing at all), whether he uses GOTOs or not, whether he codes in assembly language or a higher-level language, etc., is a private affair between himself and the code, except insofar as the final result can be improved by someone competent to judge it [1998 note: Today I would be less one-sidedly brash about this last point. I wrote it in reaction to the meaning-evacuating intrusion of what we might call "re-engineers" and others into the work process. That concern is valid. But it is also clear that "mission critical" systems need to be transparently auditable (etc.), and that, e.g., an air traffic control system which works perfectly but which nobody can understand, is not good enough. One key desideratum here is to reduce the extent of "critical" missions, and the "critical" parts of the ones that cannot be eliminated altogether, to a minimum.]. Always, programmers need time and friends.
Programming managers sometimes try to 'protect the world' (or at least themselves) from the consequences of incompetent programming by placing blanket restrictions on what programmers (competent and incompetent alike) can do. That may be easy; it also assures that the best job will not be done. Alternatively, programming managers can accept the more difficult task of distinguishing the competent from the incompetent, and distinguishing those with vision from those who lack it. The programmers who have vision must be trusted to use the facilities they ask for to do what they want to do: write (only) good programs [1998 note: They should also be expected to provide an accounting in depth for what they do and don't do, in part to help justify the trust placed in them, and in other part to help educate those to whom they render the accounting to be able to evaluate the programmers' work and their accounting of it competently]. The programmers who are competent but lack vision can work under the inspiration of the ones who "have it". Programming managers should attempt to guide individuals who are not competent onto the path of their more fortunate fellows. Where this effort proves futile, where a 'programmer' lacks the ability or the character or the desire to become a worthy practitioner of the craft of programming, an attempt should be made to help him recognize what he lacks; but, whether he understands or not, after all else fails, if the man will neither leave of his own accord nor learn, then his manager should effect his removal from programming[, and secure for the person a more felicitous social role]. (There may be some merit in the analogy between having a new idea and having a baby, at least to the extent of suggesting that a new idea is something that happens 'in' a programmer, but not under his direct willful control, and that programming management can be a kind of midwifery.)
Our mission, should we choose to accept it, is not just to transform given object configurations into different object configurations (although we must also do that). Our mission is to transfigure whatever we happen to find from mere existence to goodness. Programming, like any creative activity, can be an adventure in the relation between individuals and their work, an adventure in giving and receiving, an adventure in grace.
When a scientist appears to be applying scientific law, what he is really doing is stress-testing it. A scientist is a person looking for trouble. He chooses to do an experiment not primarily because he hopes it will produce some useful result, but to find out whether received scientific beliefs can survive the proposed experiment. Hi-energy particle physics furnishes a graphic example of this. Its institutional respectability should not obscure its subversive intent: to break open the fragments that were produced by previous experiments, and see what's there. The title of Freeman Dyson's book says it well: DISTURBING THE UNIVERSE.
Many of the things scientists do appear to fit the popular stereotype of science as incremental accumulation of knowledge, and much scientific investigation doesn't seem to disturb anything. But science exists only in the lived experience of scientists, in each new day's experiments, insofar as these call the known to an accounting of itself always one more time. Ordinary science, the experiments that 'do not disturb anything', thus serve two functions: First, by the very fact that they are experiments, which means that they interrogate the given, they keep the spirit of science alive in the lives of the persons who do them (they are science's 'morning and evening prayers', and the insights --- what I call the "minor miracles" --- a scientist has doing them can give satisfaction that sustains life-long commitment). Second, because, in whatever little way, they push given scientific theory into new areas or re-validate results already presumed to be won, ordinary experiments provide occasions for the theory to break down, i.e., they give extraordinary science opportunities to happen.
Extra-ordinary science happens where scientific explanation fails and science gets stuck. Such a breakdown can occur in a given conceptual order only when that order encounters a phenomenon it both recognizes and cannot assimilate. Since ideas do not think themselves, this implies that persons (an isolated individual or an entire scientific community) must have articulated the given ideas to the point where they got into trouble. Then the people had to pursue the emergent anomaly to determine it was symptomatic of fundamental conceptual inadequacy, and not merely that they had not worked hard enough to explain it. (Most problems turn out to be of the latter variety; the existing conceptual order is able to assimilate the new data after all, and, having vindicated itself thru the ordeal, it emerges with its stature as 'unshakable' truth enhanced.)
Once in a while, thought reaches a dead end. Then, the scientist finds himself in an uncanny place where there is no one and no thing he can make use of (least of all can known scientific law or methodology help, because they are the very things that landed him there!). Try as he may to figure out what is happening, he probably realizes that to seek what he is looking for will not likely help him find it, except 'by accident'. Even if he comes across the answer, he may not recognize it. He does best to relax, 'let go', and await what can only come --- if it ever deigns to come, which it may not --- from the unreachable 'elsewhere' the unknown is. If the new does give itself to the scientist, no particular new fact need be discovered; yet (the meaning of) every thing changes. A new world ousts and replaces the hitherto world as 'the world'.
This decision brings about a transfiguration. The results of a scientist's work may have much practical value. And he does not disdain doing something just because it is 'useful'. The scientist remains a mortal on earth, with all the needs of any other person. What does change is the relation between the scientist and insight: he no longer identifies himself with the particular entities insight happens to present to him ('church', 'country', 'company'...), but with insight itself --- the event by which there are entities (of whatever kind). All what-is are truths. Truths are 'seen'. Objects (including 'profits', 'values', 'God', etc.) require insight. Etymologically, the word "holy" means whole, healthy, entire. The event of insight --- human seeing and saying (e.g., a scientist doing an experiment or a programmer writing a program) ---, not any thing seen (such as the published results of an experiment, or the finished program) is whole, healthy, entire, i.e., holy. (Published results or finished programs, in their turn, can participate in wholeness ('holiness'), by finding a place in further creative endeavors.)
As a secular version of predestination, 'scientific living' is a self-contradiction that cannot be thought, much less acted upon, although some people (B.F. Skinner?) 'think' the contrary and 'act accordingly'. A variant makes 'scientific living' a secular version of original sin. Then the claim is that science tells people what they "ought" to do (although the people are free to do something else, be they so foolish --- and, if they are that foolish, then perhaps they should be prevented from doing what they want to do, 'for their own good', as well as ours...). The plausibility of this notion may be due to a seemingly obvious extrapolation from the fact that results of science often suggest ways we can go about trying to achieve what we want to accomplish once we have decided what the latter is. (If we want to be healthy, scientific knowledge can help us plan a healthful diet.)
But neither science nor anything/anybody else can tell a person what he "ought" to do (whatever the word "ought" means), because any such prescription would itself be judged according to the person's then current value structure ("is does not imply ought"). A person does what he does, in the end, because he "wants to", i.e., because it appeals to him. After persons decide what they want at one level of detail, science still cannot tell them what to do at the next lower level, because, in general, there are a number of different ways (more than one imagines...) to do any particular thing. Science can dictate neither ends nor means.
In one form or another, 'scientific living' is an "idee fixe" for twentieth century Western man. Behaviorist psychologists and pedagogs, "modern" architects, Marxist revolutionaries, and proponents of structured programming, to name but a few, justify what they have chosen to do, not as what seems desirable to them (and which, as a personal judgement, might seem less desirable to other persons), but as "scientific" (and, therefore, obligatorily binding on every person, especially other persons, whether the latter 'subjectively' like it or not).
But science --- the lived experience of a person doing scientific investigation --- is not value-free. Religious fundamentalists are right in their suspicion that doing science promotes a kind of life that is different from the kind of life they wish to promote. Doing science is an act of faith, like anything else people do. It is a person's testament (in deeds, not just words) how he wants to spend his life and what kind of place he wants the world to be. In one case, science may be an act of love in which a person humbly approaches a thing, and listens to what it may have to say to him (Gregor Mendel and his peas?). Then, science embodies the "value" of a world where people respect the earth and each other. In another case, science may attempt to extort and/or exploit nature's secrets (the race to find the structure of DNA? Edward Teller and the H-Bomb?). Then science embodies the "value" of a competitive world where people live in mutual suspicion and the earth is ravaged. In general, science affirms (the 'value' of) human curiosity and questioning authority (a character in Bertolt Brecht's play GALILEO says: once the people learn the sun does not revolve around the earth, they may decide to stop 'revolving' around their rulers). Science, far from being value-free, is always the partial realization of somebody's values.
Doing science is not something apart from life. It is part of life, and, for some persons, their way of life. Though science and its results cannot tell people what to do, they may sometimes inspire them.
Beyond their results, the promise of science and technology is the activity: THAT PEOPLE MAY BECOME SCIENTISTS (PROGRAMMERS, ETC.), AND THEREBY STEP THRU A DOOR THAT LEADS, ONCE AND FOR ALL, OUT OF ENDLESS REPETITION OF THE CLOSED CIRCLE OF 'DAILY LIFE', AND OPENS ONTO AN INFINITE ADVENTURE. Our knowledge, and our machines and their products, can give us, joined together as friends, or, if that cannot be, then alone as individuals, opportunities to experience the joy of insight in our daily lives. When we recognize and say "Yes!" to these opportunities, they become paths by which each of us may enter into the event --- the miracle --- that what-is, whatever it is, realizes that and what it is, and, no longer merely determined by its past, begins to create its future, begins its ascent from fact to art, begins to rise from being there to being desirable. A programmer, typing a line of code on his terminal, may enter immensity. --- Science, as Einstein observed, is intuitive activity, not something 'coldly objective'. It is a promise of creation, of joy and play, 'from 9 to 5' (but who would stop there?).
There is a facile sense in which the scientist is subject to scientific law: Even Galileo probably could not make his body fall up. There is a deep sense in which scientific law is subject to the scientist: Galileo not only produced new laws of physical motion; he altered the meaning of the very notion of 'law' in science. I imagine Galileo did physics, and I know I write programs, not to suffer necessity, but to participate in freedom. Because the laws of science are human productions, what a scientist really probes, when he does an experiment, is (the human) spirit. If a physicist thinks he is disturbing only 'the universe', he falls short of all he can do, and fails to understand who he can be. Physicist and programmer alike, when fully conscious of what they have chosen to do with their lives, know they are disturbing themselves (and their neighbors and the universe and whatever else there is...): A free man is a judge of the world.
For us and in our time, however, the phenomenon assumes a new form: an all-consuming feedback loop commanded by the so-called "social" sciences. This is an ultimate danger. It is especially insidious, because it dissimulates. It kills the dream of the human adventure in the very act of promoting education and health. The danger is not addressed but may be exacerbated by the customary efforts to achieve global disarmament, world government and an end to poverty, and it grows on both sides of the 'Iron Curtain':
Our only safe way out of the black hole is to cultivate reflective thinking. For one thing, as Heidegger says, we must think about the danger. But reflective thinking is not one more thing we need to do in addition to everything else we already do. REFLECTIVE THINKING IS A DIFFERENT WAY OF DOING WHATEVER WE DO. And, not least, it is a different way of doing our work, which, for more and more of us, is technological work (e.g., computer programming). It is the way of the individual --- who must in each case be oneself and not anyone else --- whose work is a commitment to insight, a search that is at once personally experienced and universally intended. It does not just make use of technology. It is an intimacy where the person enters into the technology, and takes the technology into himself. It is a way of receiving and giving, of seeing and saying (and listening and waiting), and it penetrates ever farther into a realm of high things that exist only insofar as they are thought. Yet it is accessible to us, and may even be the way of an unknown programmer coding a computer instruction, a programmer receiving the gift from the machine.
I would base the social organization of computer programming on simple relations of giving and receiving. There is also a natural division of labor. One person wants to design a generalized external command language interface in Backus Naur Form. Another wants to write the mainline driver logic as a table-driven finite state machine. To gain experience, apprentices beg to be allowed to do routine tasks master craftsmen would do only from commitment to getting the job done. There may be some tasks nobody wants to do; the faith the programmers have in their guiding ideal, and their commitment to [and support of] each other, must give them the incentive to do even these things well (and the grace to receive satisfaction from doing them).
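[A table-driven finite state machine, of the kind one programmer above wants to write, can be sketched in a few lines. The sketch below uses Python, a language that did not exist when this essay was written, and its states and events are invented for illustration; the point is only that the driver logic is a single loop, while all the behavior lives in a table of data.]

```python
# A minimal table-driven finite state machine (illustrative only).
# All behavior lives in the transition table; the driver is one loop.
# Extending the machine means editing data, not rewriting logic.

# Transition table: (current_state, event) -> next_state
TRANSITIONS = {
    ("idle",    "start"): "running",
    ("running", "pause"): "paused",
    ("paused",  "start"): "running",
    ("running", "stop"):  "idle",
    ("paused",  "stop"):  "idle",
}

def run_machine(events, state="idle"):
    """Drive the machine through a sequence of events."""
    for event in events:
        # Events with no entry in the table leave the state unchanged.
        state = TRANSITIONS.get((state, event), state)
    return state

# Example: the machine starts, pauses, resumes, and stops.
print(run_machine(["start", "pause", "start", "stop"]))  # -> idle
```

[The division of labor mentioned above falls out naturally: one person can own the table (the 'language' of the machine), another the driver, and neither need wait upon the other.]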
Building good programming systems depends critically on two things:
How should one go about organizing a large programming project? Believe in it. Not just in its profitability or its utility, but in it. (The project should have a use, and if it has a use it will probably find a market. But, often, one selects one project to do from among a number of different projects one might have done, and, even when a particular project "must" be done, there are always dramatically different ways to do it. Ask yourself: "How can I do this thing such that I would still do it even if I did not 'have to do it to earn a living'?") Experience the project deeply (so that it becomes a place where you really feel "at home"). Invite your friends to join in: "Look at this neat thing! Let's do it together!" And have humility (...we may fail).
For us: Meaning resides in the very event of seeing (insight), not in any thing seen (object). What is truly wonderful is not any wonderful thing, but WONDER. The fulfillment of meaning is in us, when we are where the event of insight occurs, not in another world to which our entry is uncertain and in any case only in the future.
Such an understanding raises the goal of technological work from 'making profits' or 'satisfying social needs' (both of which, like the Hindu wheel of rebirth, are without end and without issue), to the quest for insight in the creation of things of enduring value. Commissioned to make something important, to figure out how to make it right, and to understand both the undertaking and its results (which task of understanding is why the undertaking is important), the worker may find high satisfaction in his work. Because he not only makes something, but learns something, and communicates what he has learned, he advances truth in his time, and thus, though his name be unknown, he makes a real difference in universal history. Beyond the spectacle of 'famous people' and 'big events', he makes the only difference that is a difference (since all differences are truths in their time, i.e., living events of insight). Ideas are ideals. Meaning (signification) is Meaning (purpose). Knowing is the miracle. Good work (among good friends) is 'heaven'. And we may see face-to-face.
That we no longer believe in the existence of the object 'heaven' is not cause for us to despair of enjoying the heavenly, and we need not envy those earlier men-of-faith's sense of the meaningfulness of life as something denied to us. We can take care in making our computer programs (along with whatever else we do), not for "Our Father in Heaven", but for the 'heaven' the work can be for us, on this earth, here, now, in such places as scientific and technological laboratories (e.g., programming workplaces). The work done there can be (though, as we have seen, often is not) CONTINUOUS CREATION for the workers, whose natural consequence is progressively to free the rest of mankind for CONTINUOUS CREATION, instead of most having to spend their lives in repetitive, stupefying labor to sustain biological existence. Creative work, if we undertake it as caring commitment (to its subject matter and ourselves), can transfigure our lives. And we may reasonably expect this inward grace to manifest itself in outward and visible signs --- 'products' --- that, in addition to satisfying present needs, will be living sources of inspiration and joy, not only for ourselves, but also for people hundreds of years in the future (or intelligent beings from other worlds) for whom they will be of no practical use, as the beauty of the cathedrals of the Gothic era still shines forth for us, even though we no longer share the faith of their builders.
"The city is the place of availabilities. It is the place where a small boy, as he walks through it, may see something that will tell him what he wants to do his whole life." ---Louis Kahn
Need is universal. No matter where he grows up, a boy will enter some occupation, probably his father's, and toil to sustain his own life and his family. In a city, on the other hand, as Kahn's words tell us, the variety of patterns of meaning overflows in the shared daily life of free individuals. There, a boy may find something he wants to do. (Desire, over-flowing need, truly is 'super-fluous'!)
What makes a city be a city is not the packing density of its inhabitants, but their individual commitments, beyond economic considerations, to their respective skills, and sharing their commitments with each other. A city is where seekers after truth (insight) build a common life dedicated to the search for truth together. No longer unwittingly bound in the rituals of their origins by the fatality of ethnicity ('roots'), their life opens to the infinite freedom of universal culture (the 'beyond' of availabilities). And a small boy, looking from the work of one master to that of the next --- architect, physicist, chef, computer programmer, etc. --- may find something he wants to do, the start of his own path into truth. ---The highest test of a city is not that it provides a place where those already committed to truth can work together, although that is important. Even higher is that, in the city, those who have not yet entered upon the path of the search for truth may be inspired to do so.
To found a city is not primarily to erect buildings, but to open social opportunities for people to meet truth. It is to make a place where the truth and the people find a home. If programming projects are organized as I have advocated, they are by their nature city-building activities. If politics is the life of the city ('polis'), then the social activity of developing a large computer system is basic political activity. All that remains is to fit the ancillary activities of life into the framework it defines. We can imagine a community organized to develop a general purpose computer architecture and its hardware and software implementation: Its mission would be to provide the best possible computer system to the various communities that need computers (their missions in turn would be to produce the best possible...). Required is a physical workplace for the programmers and engineers. Thus architects, carpenters and masons enter into the life of the community. The people must have places to sleep. More opportunities for the construction professionals. Everybody has to eat. Enter chefs and their apprentices. And so on....
How is this different from a 'company town', with its odious connotations? Here each person is offered not merely a chance to earn a living at his trade: He is also chartered to extend his craft in the interest of enhancing the mission of the community, which is not just to make the best possible computers (although that in itself would be much!), but to make the best possible community for making better computers. No, the community's mission is more than even that: To make itself be the best possible community for making itself an even better community, where 'better' means that each individual sees more clearly, and shares his insight with others (that the community happens to make computers is a fact of its historical situation, the 'territory' in which it pursues its mission, not that mission itself, which is for each member of the community to see; if the community existed in some other place and time, or if its members were different, [or if the members of the community found something better to do than to make computers], it [i.e., the community] might make something else).
Every job in such a community is both a production and a research position, a useful social task and an adventure in insight. A cook's job is not only to stoke stomachs, but to serve food so good, and in such a delightful atmosphere, that the engineers and programmers (and the cooks themselves!), every time they eat, experience an example of excellence that inspires them to do what they do, whatever it is, better. Practice savoring soup is good practice for savoring computer code. The phrase "Building Jerusalem" refers to a vision of real communities, in the late 20th century, on both sides of the 'Iron Curtain', dedicated to the search for truth.
The last paragraph is utopian. Or is it? To the best of my knowledge, no test has been made whether, even in strictly economic terms, such communities would be more productive than our society with its offices where people don't live and its dormitory suburbs where people don't produce anything, our society in which people do their jobs only because they 'have to', and, whenever they have the choice, consume things instead of producing them. If we can bring it to pass that what persons really want to do in their lives is their work --- and this may be possible in technologically advanced societies, not just for a few technocrats, but also for many skilled craftsmen ---, they will produce more and consume less (the latter, in part, because of the transfer of much time from consumption to production). Overall product costs, gourmet meals included, might well be less than in the current system, after factoring in the added value of the increased productivity, and the diminished expenses for all the things people did not buy because they were working instead. Even if unit cost is not less, since what is produced is better (does more with less and lasts longer), each unit might still consume a smaller part of available productive capacity.
There are other things in life besides work. But my thoughts focus on work for several reasons:
Labor --- the stupefying routine of zek, assembly line operative, file clerk, structured programmer --- still curses many. (Creative) work can bless us. Thru it, we may receive what H. Broch called "the comfort upon earth". I hope for this. My hope is founded in certain experiences I have had of the kind alluded to above. It is supported by the (surely prolix, and prolifically flawed) theoretical expostulations in this paper. There may also be suggestive historical precedents, of which I wish to cite one. It is not an altogether comforting example, for it raises a question about the compatibility between truth and life. It is a non-trivial example from America's still not-too-distant past: The Manhattan Project at Los Alamos in the early 1940s that built the atomic bomb. It stands out both for the successful accomplishment of a difficult mission, and, in the memories of its participants, as "the high point in most of their lives". It also founded a city (or at least a town).
In my everyday life and work I seek insight as best I know how. Its gift may bless an individual even in the most unfavorable place and time (which is fortunate, since often an individual's situation is not favorable). The best times, however, are the ones where the gift expands to assume also a public character. I want to live and work in a 'programming Los Alamos', where the product is (e.g.) innovative educational software, not a bomb, and the project is not a military secret, but widely known: one among many heavenly cities lighting up a transfigured earth.
Quantification of a range of phenomena does not automatically lay the foundation for a genuine science. The social sciences' basic project is: to study people "as we would rabbits or chipmunks and discover how they behave under different patterns of environmental stimulation", and apply what is learned thereby to the administrative management of everybody's life. A science can be generated by such endeavor only if human behavior is necessarily a function of environmental stimulation. A tragedy can be generated by such endeavor if the endeavor itself motivates people to believe their behavior is a function of environmental stimulation, and to act accordingly, so that, by [unwitting] role playing, they deny themselves a different and richer form of existence[, without being aware they've missed anything].
But persons can also try to understand their situation and interact with it (try to be accountable instead of trying to act predictable). Social science investigation generally requires that its subjects not understand or be accountable for their real situation[, which is]: BEING WATCHED, because the social scientist wants to see how people behave in another situation that does not exist, namely, the one that would exist if the social scientist was not looking. Where the ideal of complete deception is unattainable, and the subjects have to be told they are being studied, the social scientist makes do as best he can, by misrepresenting the nature of the experiment, trying to distract or numb his subjects so that they forget they are under surveillance.... Social scientists claim to study man, but what they more often study is: people in a condition of benightedness contrived by the social scientist to make his 'science' possible. Subjects who refuse to cooperate in their deception (or who do not successfully deceive the researcher into thinking they were deceived when they were not...) are eliminated from the study (and, wherever possible, given "treatment" for their condition), while the thoughts and feelings of those who are successfully deceived are deprecated as 'merely subjective'.
These social 'sciences' are eager to study just about everything people do (or at least those things they can get grants to study, which, often, are activities of persons agencies of established social power wish to control). But they do not appreciate, nor are their objectifying methods capable of appreciating (because appreciation comes from participation and reflection, not from observation and tabulation) life as it is lived 'from within', especially its higher forms: creative insight, philosophical discourse, contemplation of beauty, grace, forgiveness, "love, ...keeper of warm lights and all-night vigil in the soft face of a girl." By treating people as something less than who they can be, however, social scientists can help bring it to pass that that "something less" is what the people become. The people might opt for something different, if they had a choice (which often they are denied in the name of the social scientists' 'objective' findings, findings that, in the end, are 'object-ive' only in the sense that they reduce persons to nothing-but objects). (There may be a legitimate place for studies where a person hides behind a one-way mirror and observes others. But I would approach each such study with trepidation, and, as soon as I had completed my observations, I would come out in front of the mirror, and say to the persons I had observed: "I watched you. This is what I saw. It is yours, not mine. If you want, maybe we can work together to learn something from it. I would like to do that." First, however, I would say to them: "Forgive me.")
To undertake this kind of social(b)l(e) science is no less an act of faith --- a personal choice --- than to do the other thing. But I like it better (you may, too). In it, observation of people has at most a subsidiary place. (When, on occasion, the observational relation does occur, it becomes labile: from one moment to the next, the observer becomes the observed, and the observed the observer, and each observes himself equally as he observes others.)
In genuinely collaborative work, everybody's attention focuses on the subject-matter of mutual interest and the methodology of dealing with it, not on any of the collaborators. Colleagues do not treat each other as "subjects". Instead of trying to explain a colleague's behavior, one tries to fathom his meaning. One does not presume the other is necessarily "right", but one does "take him seriously" [(e.g.: "What might he see that I don't that makes him say this thing which doesn't seem to make sense to me?")]. Each seeks to understand what all are trying to say, and reflects on everything --- his own ideas, the others' ideas and the evidence --- by the light each sheds on all. A dialog arises, and it does not constrain individual freedom; it expands freedom by offering to an individual's consideration meanings of which he would otherwise remain oblivious. A dialog is a real unity that can exist only if the participants remain autonomous.
An alternative to the tyranny of the one-way mirror is friendship (not just the human analog of dogs sniffing one another, but a shared adventure in reflective thinking among persons who respect one another). Instead of managing people's lives --- others' or our own ---, we can experiment in enriching our shared participation in freedom. We can walk away from the mirror, and, attempting to hold ourselves and one another accountable for our words and acts, meet face-to-face.
"man is held into the everlasting now of the question,
into the everlasting now of his knowledgeless-knowing,
into his divine prescience,
knowledgeless in that it asks and must ask,
knowing in that it precedes the question,
as his innermost human necessity,
for the sake of which
he must put his perception to the test again and again
and be proven by it again and yet again,
man trepidant for the answer, perception trepidant for the answer,
man bound to perception, perception bound to man,
both held together and trepidant for the answer,
overcome by the divine reality of fore-knowledge,
by the magnitude of reality embraced by the knowing question,
a question never to be answered by the truth of earthly knowledge, and yet
which can be answered, must be answered here alone in the realm of earth,
verily man is held into his task of knowing,
and nothing is able to dissuade him,
not even the inevitability of error,
the bound nature of which vanishes before
the task beyond all chance..."
 I urge the reader to consult Philip Kraft's book, PROGRAMMERS AND MANAGERS: The Routinization of Computer Programming in the United States, published by Springer Verlag, New York, 1977, for further information on this topic if he is not already aware of what some of the proponents of 'structured programming' and other 'improved programming technologies' have tried to do to programmers and programs. See also Gerald Weinberg's [what might almost be called Stalin- or McCarthy-esque] review of Kraft's book, and Kraft's reply to Weinberg's review, in October 1978 DATAMATION (pp. 210-213).
 Seymour Papert's book, MINDSTORMS (Basic Books, New York, 1980), illustrates what I am talking about here, although in the context of elementary-school children, rather than working adults. MINDSTORMS is about the LOGO project at MIT, where children explore mathematical concepts, and their own algorithmic thought processes, by programming a computer. The book describes the young 'programmers'' progress in understanding the nature of algorithms, and also their progress in self-knowledge, as they discover how to teach the computer to do what they want. In this process, the children gain a sense of mastery over their environment, as they see the computer do what they told it to do, and understand the steps the computer followed to do it. My education was the opposite: My environment tried to master me. I was programmed ("taught") by people who were, perhaps unwittingly, exploring concepts of psychological manipulation. They had only equivocal success getting me to do what they wanted, and neither they nor I made any progress in self-understanding from the ordeal. I resented being manipulated (although I could not then have formulated the problem in those terms). My 'programmers' were frustrated by my 'disobedience', and probably wondered what they did wrong [more precisely: what was wrong with me!] that caused their instructions to fail. Even today, three decades later, I still suffer from having been the direct object instead of the subject of the verb "to program". My erstwhile instructors probably never did realize that their efforts were doomed in advance, because I was nothing instructable, but rather an instructor. (How many children have suffered and continue to suffer from this tragedy, which, to me, seems akin to raising them as dumb animals?)
 "Alas, the time of the most despicable man is coming, he that is no longer able to despise himself. Behold, I show you the last man.... 'What is love? What is creation? What is longing? What is a star?' thus asks the last man, and he blinks. The earth has become small, and on it hops the last man, who makes everything small.... 'We have invented happiness,' say the last men, and they blink." ---Friedrich Nietzsche (THUS SPOKE ZARATHUSTRA, "Zarathustra's Prologue", Section 5, in THE PORTABLE NIETZSCHE, published by the Viking Press, New York, 1954, pp. 129-130)
 To say this another way, might sentences like "And the programmer (architect, physicist...) saw that the line of code (the placement of the window, the tracks in the cloud chamber)..." be our equivalents for biblical sentences of the form "And God said to..."?
 These thoughts suggest an important consideration in designing computer systems for 'end users' (i.e., people who are not professional programmers). There is a widespread belief that users should be presented with an environment as 'natural' as possible, where 'natural' means the kind of things that were commonplace in the pre-computer era of human culture (e.g., systems with which the user can communicate in natural language (preferably spoken, not typed), text editors that show the final text directly on the screen as the user enters it instead of requiring him to indirectly specify what the document will look like thru a markup language, etc.). This seemingly obvious objective may often be inhibitory. Instead of (or at least in addition to) aiming to conform the computer to the user's current habits, perhaps we should give him an opportunity to overcome his habits in the direction of appropriating the computer's powers in the most effective ways. This endeavor will often involve the user mastering new ways of expressing himself ('artificial' syntaxes). Neither is the process necessarily painful: the user may be 'turned on' by a new way of doing things that is elegant, and that enables him to see better ways to do things in his area of expertise [(e.g., physicists using the APL programming language)]. The key to facilitating friendship between users and computer systems is not to give users the illusion that nothing has changed when it has changed (like molding phony stitching in vinyl automobile upholstery, or making 'woodgrain' formica desk tops), but to give them interfaces that enable their creative potential to venture forth, first with ease into areas that were previously known but refractory, and then into areas the very existence of which they did not suspect.
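[The point about 'artificial' syntaxes can be made concrete. Below, one and the same computation --- a root-mean-square --- is written first as an explicit loop (the habit-conforming way), and then as a single expression in which the whole collection, not the element, is the unit of thought, after the manner of APL. The example, and the Python language it is written in, are of course my illustrative additions, not anything from the essay's mainframe era.]

```python
import math

data = [3.0, 4.0, 12.0]

# The 'natural', habit-conforming way: an explicit element-at-a-time loop.
total = 0.0
for x in data:
    total += x * x
rms_loop = math.sqrt(total / len(data))

# The 'artificial' but more powerful way: say the whole computation at once,
# as one would in APL --- the array, not the element, is the unit of thought.
rms_expr = math.sqrt(sum(x * x for x in data) / len(data))

# Both say the same thing; the second says it in one breath.
assert rms_loop == rms_expr
```

[Mastering the terser syntax costs the user a habit, but may repay him with a new way of seeing his own problems.]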
 I thank David N. Smith for helping me formulate more clearly this idea: A tool people create to solve a known problem often ends up changing the problems the people deal with to new (different) problems the people never even imagined before the tool existed. David said that to design a software system by giving its intended user what he thinks he needs, i.e., (1) asking the user for detailed specifications of what the system should do, then (2) implementing the specifications, often is not good enough. This paradox arises because the system, once implemented, changes the work the user does. By its very success in meeting the original requirements, the system ends up failing to satisfy the user's new needs (which the system itself has altered). David reminded me that insight changes what people do, in addition to changing the way they go about doing whatever they do. Medical science provides examples of new technology changing the problems/requirements. Effective polio vaccines save the United States each year more than its entire medical research budget (John Funder, SCIENCE, 7 December 1979, p. 1139). But who would take those savings into account in the 1984 budget? That the general progress of medical science in solving people's requirement for long and healthy life created the problem of a world 'population explosion' shows how giving the user what he thinks he needs may be far less than 'good enough'.
 This paragraph derives almost verbatim from Emmanuel Levinas' book, TOTALITY AND INFINITY (see Bibliography [not here included; however, the user may wish to consult the bibliography for my doctoral dissertation, which, though a decade later, builds on some of the things I read when I wrote this document, and which proved of lasting value]).
 All 'work' (labor) produces something (a product); good work produces a work. A work is not just anything made, but something high. The term "masterwork" is redundant; the phrase "minor work" is not. One speaks of J.S. Bach's "works". To speak of The Beatles' "works" is already incongruous. Most work in our society does not produce any 'work'. But, to make a work seems to me the reason to work (at least insofar as one is not in physical need...).
 "...when it wants to" ---This is a potentially misleading image, which recurs, as a leitmotiv, throughout this paper. I do not mean to attribute an anthropomorphic faculty of will to 'the solution to a puzzle' (or to what I will speak of, generally, as "the new"). It would be more correct to say only: "The solution comes when it comes." I retain this image of 'will' in regard to something that probably does not have a will, intentionally, though with misgivings, to highlight how dependent human willing (which may take national, ecclesiastical or tribal, as easily as 'secular' and 'personal' forms) is: Even when it commands nuclear fusion and computerized dossiers, human willing is 'trite', not just because it is empirically weak (it offers no prospect of raising the dead...), but also because all it can do is wear away (at-trite) what it already has and is entirely incapable of bringing forth anything genuinely new. As David Hume discovered when he looked into the mind and found there only faculties for producing relations of association, resemblance and contiguity, human willing can only mix and match what it already possesses (do 'more of the same'). But, to use Jacob Bronowski's word, human existence also ascends, and ascent is vertical (something decidedly better, 'a cut above'), not horizontal (more of the same). None of the things that make life ascend can be produced by human (Humean) willing. "The solution to a puzzle comes when it wants to" emphasizes this dependence of man on what is not his, the importance of what John Wild and some others have called "man's openness to otherness": a flash of insight, or, sometimes, a smile. Man cannot will these things (what he can will is to keep himself open to the possibility of their coming, by treating all his accomplishments and plans as questionable and always keeping clearly in mind that the things that really matter are not and never can be brought under his control). 
Our ascent (and even our having made it to where we already are) depends on what is radically 'other' coming to us, "from out of the blue". Let me use this last image, suggestive as it is of UFOs of a triter sort, to speak my hope: May our human willing --- all our making plans and implementing them --- have as its aim to make our planet as a whole, by making each person as an individual, a brightly-lit landing pad for new ideas and journeying spirits (among whom we should not expect our next-door neighbor to be any less strange than a visitor from another galaxy). May we be always alert to interrupt construction whenever one of these 'vehicles' approaches, to help it 'come in', to greet it, and also, to change our plans according to the news it brings (how can we already be sure what form we 'airports' should take, when our function is to receive what we do not yet know?). Finally, may we 'landing pads for othernesses' each know he is an otherness to every other --- each of us a journeying spirit ---, that landing is not the end of one's journey, nor even a pause in it, but entry into an other's journey (...compounded motion). ---This is the sense I wish to convey by associating such phrases as "when it wants to" with the coming of the 'other' (e.g., the solution to a puzzle). [ See also country music songwriter Mary Gauthier's description of the creative process: Quote #231.]
 I find the deeply nested 'If...Then...Else' constructs in 'structured programs' cumbersome, and the strung out 'case' logic, iteration and 'sequence' code wearying. I, too, try to eliminate GOTOs, but not because I selectively abhor GOTOs: I try to eliminate everything. I try to eliminate IFs and ELSEs. I try to replace many instructions by a few, or better yet, abolish them without replacement. Getting more function from less code 'turns me on' (even if it takes me more time to write it and/or the result is understandable by fewer people). But I find no merit in disguising an honest 'branch' as a contrived 'conditional', or in any of the other contortions 'structured programmers' go through to hide its nakedness. (When revisionist 'structured programmers' call a 'branch' an "ITERATE" or "LEAVE", are they tacitly admitting they themselves do not really like the kind of programs their principles, taken literally, produce?) For a long time, ever since I learned a little about recursive function theory, I felt there must be something better, and I found at least part of what I was looking for in APL: Function definition and composition shrink sequence code. Arrays and recursion eliminate iteration. Imperative expression evaluation replaces case logic and further shrinks the sequence code. (APL is a programming alembic: the intensity of meaning that can shine in a one-line APL program, and the sensuous delight in working with APL symbols --- they are like friendly little animals that want to play with you ---, show one way programming can be art. APL may be a work of art. [--UNIX, especially Perl 5, regular expressions are another example here.]) Good structure, in the sense that a program does something good, does it in a direct way, and clearly shows that and how it does both, may almost constitute a definition of good programming. 
But this is a matter of clear insight and fluent expression, not of employing only officially sanctioned 'control structures' to format the code. [1998 note: After writing GOTO-less programs for over 10 years, I now find them "second-nature", just like anything persons do over and over again comes to seem "natural" unless it proves entirely self-destroying, in which case, obviously, the persons perforce cease to do it. That doesn't make it be good, just taken-for-granted (like other dubious social customs, such as ritual male and female "circumcision"...), so that nobody thinks about whether or not it should continue to be done.]
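The kind of code-shrinkage this footnote attributes to APL can be loosely suggested in a modern scripting language (Python here, purely as an illustrative analogue; the function names and the root-mean-square task are my own inventions, not anything from the original essay):

```python
# 'Structured' version: explicit sequence, iteration, and case logic.
def rms_iterative(xs):
    total = 0.0
    count = 0
    for x in xs:          # iteration, one element at a time
        total += x * x
        count += 1
    if count == 0:        # case logic for the empty input
        return 0.0
    return (total / count) ** 0.5

# 'APL-style' version: operate on the whole array at once;
# composition of whole-array operations absorbs the loop and the IF.
def rms_arrays(xs):
    return (sum(x * x for x in xs) / len(xs)) ** 0.5 if xs else 0.0
```

Both compute the same value; the second replaces the explicit loop, counter, and branch with a single composed expression over the array as a whole, which is (in miniature) the gain claimed above for arrays and function composition.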
 In PROGRAMMERS AND MANAGERS, Philip Kraft distinguishes between 'structured programming' as (1) a conceptual model in computer science, and (2) a practical data processing management tool. Even as a conceptual model, it may be infelicitous. As a DP management tool, its effect on programmers, by reducing their craft to fragmented assembly line labor --- a sort of 'Gulag of the Mind' ---, can be devastating.
 It is interesting to note that a college degree is often a minimum requirement for entry level programming jobs. This is doubly perverse. First, programming appears to be a natural aptitude. A born programmer 'takes to code like a duck to water', with or without any formal education beyond elementary reading, writing and arithmetic. Several of the best programmers I know do not have college degrees. Conversely, I continue to be amazed by university graduates (some of whom have earned advanced degrees) who spend their days docilely performing ersatz structured 'programming' tasks their less schooled (but more spirited) fellows would reject out of hand as useless and degrading.
 One may object: Many programmers do write programs 'for their own sake', in their 'spare time', on their home computers. Unfortunately, that is correct. How much creative potential is lost to society, and how much satisfaction in public accomplishment is lost to the individual, when people dissipate their energy on private diversions instead of investing that energy in their work? How can jobs be so shaped that persons want to do their work? The change from a consumer society to a producer society would go a long way toward healing the spirit of modern man and solving the world's present productivity problem.
 It is not enough for a programmer to do a better job of the job he has been assigned. He should think farther, and ask if the assigned job is the right job to do, or whether something else, or nothing, should be done instead. The programmer should consider whether 'successful' accomplishment of the technical task might make things worse (e.g., by exacerbating unfortunate social relations). In computer system design, one of the first programming questions should always be what social structures ought to exist. The best 'program' will sometimes be a program for social change. When the technological worker lets himself be used as a tool, when he becomes part of the machinery (a 'technician'), when he restricts insight to technical matters and implements whatever he is asked to make (perhaps because "the function of programming is to write programs that meet user needs"), then, if he does his job well, he may make good Zyklon-B. To write code is to shape people's lives. 'Human factors' is an integral part of the programming task. But it is not confined to making systems easy to use. It extends to evaluating what the systems will be used for. Whether a user talks with the system in natural language or in a cumbersome artificial syntax is a less important human factor than whether (s)he is an independent consultant or a clerk.
 What is "the new"? It is the event of insight. It cannot be described, but it cannot exist apart from awareness of it. It can come to us in the "Aha!" experience (the phrase "'Aha!' experience" comes from TECHNOLOGY REVIEW magazine). Maybe [an instance of] it is 'the slab' in Arthur Clarke's film 2001. It is 'the light of the mind which is the light of the world' ('lux mentis, lux orbis'). Yet we may be oblivious of it[, somewhat like Heidegger's example of the persons looking for their eyeglasses and not finding them because they're already wearing them and looking through them]. [There is a serious issue regarding "the new": Not everything unprecedented is desirable. The word "neoplasm" points to the problem. And it will not do to say, e.g., that cancer is "nothing new". AIDS revealed previously unimagined potentialities, but they were in no way expansions of the horizon of experience which persons would generally welcome.]
 In the Ninth DUINO ELEGY, lines 68-71.
 There is a third kind of activity in contemporary society that is neither repetitive and traditional (craft), nor radically self-critical (science): fashion. Unlike craft, it pursues change; but, since, unlike science, it does not pursue self-understanding, any improvement is largely unintended. It generates 'novelty', which, on the "outside", looks different from what came before, but usually, "inside", is no different, like Detroit's annual model changes of the Fifties (if there is anything really different "inside", you can't tell that from the outside, because the engineering and the styling --- the coding and the designing --- are separate). By calling itself "progress", fashion has helped give that word an undeservedly bad reputation. But novelty is not the new. Gadgets and gizmos are not renewal. With or without tail fins, a person still has to drive to work; his relation with tail fins does not transfigure his job or homelife or help him understand either (although polishing them may be the closest thing to the joy --- the light --- of real creative insight in his life).
 This section owes much to Jacob Bronowski's television series, THE ASCENT OF MAN, and to Thomas Kuhn's book, THE STRUCTURE OF SCIENTIFIC REVOLUTIONS, published by The University of Chicago Press, Chicago, 1970.
 The analogy between programming and physics is not impaired by the obvious difference that men architect computer instructions but not electrons. Programs, like all human productions, are based on a substrate we did not create, and are, therefore, more than what we think they are. That makes them as legitimate objects for analysis as things in nature. That physics has built up an extensive body of scientific 'law' whereas programming has not, is due neither to programming's youth, nor to any deficiency in its 'scientific-ness'. It is due, rather, to the fact that once we learn how computer instructions behave, we can change them, to make them more to our liking. If we could re-architect elementary particles, the corpus of physical law would be much smaller than it is.
 Example: The failure of the Michelson-Morley experiment to detect any effect of the earth's motion on the speed of light.
 People often give some seemingly descriptive name to the event whereby the new comes from the elsewhere: 'the unconscious', 'intuition', 'genius'.... They imagine they have thereby categorized it, domesticated it, included it in the aggregation of things that exist in the world. BUT THE NEW REALLY IS NEW EACH TIME IT HAPPENS. If a person does not yet know what he does not yet know, then, since the new is that, nobody can know anything about it, and all the words like 'the unconscious', 'intuition' and 'genius' are wind eggs. The new is not something people can understand; it is 'something' they can welcome. ---See Henri Poincaré's essay "Mathematical Discovery" (in SCIENCE AND METHOD, Dover Publications, Inc., New York, 1952, pp. 46-63), for one person's description of his experiences of it.
 The kind of event I have in mind here (to use an example from Norwood Hanson's book, PATTERNS OF DISCOVERY) is what Kepler may have experienced when he saw the orbit of Mars as an ellipse, instead of as compounded circles. It is something like 'seeing the figure' in a Gestalt psychology puzzle, where previously one saw only color patches. In Chapter 10 of THE STRUCTURE OF SCIENTIFIC REVOLUTIONS, Kuhn discusses this analogy and its limitations.
 There is another phase to this process, the phase that completes the cycle back to 'ordinary' science. It is the scientist's critical examination of what he has received. This interrogation installs the new (as truth and world), or leads to its rejection. It shows that man, though he does not command the new, is not merely commanded by it, either: Man is its correlate. ---These considerations suggest a non-leveling notion of human equality: Man equal to the new; each person equal to the task of receiving and testing and communicating truth.
 A person chooses to open himself to the coming of the new, by, e.g., becoming a scientist, and doing everyday experiments as conscientiously as he can, keeping ever alert for ???. But he cannot anticipate what will come of his vigil. He freely signs up for something about which he can know in advance only that it will probably turn out quite differently from the way he expects it will. A programmer may encounter this radical indeterminacy when he faces a system that has just 'crashed'. Such events are our equivalent of the adventures of the heroes of myth: beyond all things seen and said (from the dangers faced by Jason and Odysseus to the Big Mac one ate last night), insight --- human seeing and saying itself --- is the finally final and truly unbounded frontier.
 It also derives from a root that means "a good omen". Footnotes 26 and 27, immediately below, indicate the kind of 'a good omen' insight, which I call "holy", may be.
 Insight presents truths. Truths are speakable. Speech is communication. Therefore, insight, the relation with things, points beyond itself to the possibility of the community of spirits: friendship. Friendship is 'higher' than insight. But insight retains a place in the structure of the holy, first, because one may be alone, second, because, without insight, one may not be able to recognize an other at all, or, even if one still can recognize him, without insight, one can only greet him 'empty handed' ("I write programs as a way to approach those to whom I would go, and as a way to welcome those who would come to me: What I see is what I have to give to you.").
 Robert Musil put the matter thus: Western man long ago took a wrong turn in his understanding of the mystical. Musil thought the mystical should be sought in the most exact and lucid experiences, precisely in their clarity (e.g., in scientific insight), not in anything obscure. Might the 'ordinary' event of insight, which occurs whenever a person understands anything, if carefully attended to, show itself to be extraordinary, and even lead to ecstasy? (On the 9 Nov 1981 National Public Radio MORNING EDITION program, Dr. George Leonard, author of THE ULTIMATE ATHLETE, said something that bears on this: Really good athletes often have mystical experiences --- Leonard cited time slowing down and auras --- during their athletic performances. He also said the athletes, while admitting these experiences to him in private, deny them in public, because they fear damage to their careers. I ask the same thing for computer programmers that Leonard claims for football players: that a vast stillness open up for a programmer entering a line of code on his terminal, as it may open up for a quarterback looking for an open receiver, and that an aura may appear around a key computer instruction, as it may appear around a fullback.)
 John Wild, EXISTENCE AND THE WORLD OF FREEDOM, Prentice-Hall, Inc., Englewood Cliffs, New Jersey, 1963, p. 85.
 Colin Rowe discusses the cause of "modern architecture" in his introduction to FIVE ARCHITECTS, published by Oxford University Press, New York, 1975. There, Rowe makes a strong and welcome plea for pluralism. But I think he fails to penetrate thru to pluralism's full promise. Pluralism is not just a negative virtue --- 'toleration' ---, by which people agree not to try to run each other's lives. Nor is its highest promise diversity ("let a thousand flowers bloom"), although that is a benefit of it. Pluralism is the positive condition for persons to be able to give gifts and receive them, provide hospitality, love one another..., when these events are considered not in their sociologically common forms (sending mother flowers on Mother's Day, being greeted by a motel clerk, hugging one's kid today), but as what, in the back of our minds, we hope to find ("the real thing"). The real thing can occur only if the other person is a mystery, if (s)he is radically "other" than oneself: "another world". When one 'understands a person's motivation', his gift, hospitality and love are diminished (if the question "Why does B love A?" has an answer --- because A is rich or beautiful or reminds B of his mother, or because B pities A or doesn't want to be alone, etc. ---, then B's "love" is vitiated). The promise of pluralism is adumbrated in Emmanuel Levinas' assertion (TOTALITY AND INFINITY, p. 79): "Everything that cannot be reduced to an inter-human relation represents not the superior form but the forever primitive form of religion." (These reflections suggest a reason for 'private property', beyond economics: that a person may have something to give to a friend who does not need it, e.g., an invitation to his home for dinner and to spend the night, when the invitee has his own comfortable abode with a well-stocked refrigerator. 
They also suggest a positive goal for all business enterprises, be they in the 'private' or 'public' sector: to help each worker become economically independent, so that he may exercise his skills as a free man, and neither use others (as an 'employ-er') nor be used by them (as an 'employ-ee').)
 Not just doing science, but a scientist's espousal of a particular scientific theory is an act of faith. It is something much deeper than a 'hypothesis', and it is not just 'ego identification'. Thomas Kuhn notes, in THE STRUCTURE OF SCIENTIFIC REVOLUTIONS, that scientific theories are rarely if ever abandoned because of the discovery of 'falsifying evidence' (as one prevalent philosophy of science would suppose). The 'falsifying evidence' can usually be explained away somehow, or, if not, it can be relegated to the backlog of unsolved problems (which every theory has), to be dealt with later, after the theory will have been 'developed further'. A new theory, Kuhn continues, does not gain adherents because it explains more of the data. At first, it usually explains less! The theory gains adherents by inspiring people's commitment (e.g., by making sense of some particular piece of data the people find especially interesting...). Finally, the adherents of a 'discredited' theory are rarely converted to the new theory. More often, they die off without being able to attract members of the younger generation to carry on their work.
 The character WOZZECK from Alban Berg's opera comes to mind here. And to call structured spec-ers and coders "analysts" and "programmers" is like calling garbage collectors "sanitation engineers".
 To work on something that is uninspiring, and/or among people who are uninspiring, impoverishes my life by denying me opportunities to share with others what I have learned, and to learn from them. Quality in the product of our work, and encouragement of independence and growth in co-workers, 'pay off' from a selfish point of view.
 The ideal of the life of science is presented in the 1936 film THINGS TO COME (from H.G. Wells's THE SHAPE OF THINGS TO COME): The scientist does not demand anything from anybody, but he freely gives his knowledge to all. He ventures into the unknown, into extreme danger. He does not ask anyone else to risk anything. But he does resist attempts from wherever to prevent him, and those who may wish to join him, from risking themselves for knowledge.
 Edmund Husserl, THE CRISIS OF EUROPEAN SCIENCES AND TRANSCENDENTAL PHENOMENOLOGY, Northwestern University Press, Evanston, 1970, pp. 23-59.
 What is freedom? The option to select between one thing and another is only part of its husk. Freedom is encounter with the 'other' (the new, or an other person). It lives in human openness to listen and respond (human 'response-ability'), and to see. A scientist's freedom does not consist in any choice he has between doing one experiment or a different one. It consists in his awareness that science exists only in his openness: (1) to listen to whatever question science may ask him, here, now, and (2) to give himself to the task of answering that question. "Freedom of choice" often exists where there is little real freedom (e.g., where a consumer picks between brands of interchangeable consumer products, but (s)he does not think about what it means to be a consumer). Freedom is the gift of being able to give to and receive from things and persons, thematically given to and received by itself ('self-conscious'). It is seeing a hole in the tyranny of "the real", recognizing it as such (a 'break-thru', or, sometimes a 'break-down'), and, like Alice, going in.
 ---To "judge", in this context, may sometimes mean to bless, as in Walt Whitman's statement: "The poet judges not as a judge judges, but as the light falling on a helpless thing".
 See June Goodfield's book AN IMAGINED WORLD: A STORY OF SCIENTIFIC DISCOVERY (Harper & Row, New York, 1981), for a biographical account of the experience of one scientist who characterized her relation with her subject matter as "physical...like making love". Consider also Louis Kahn's question to a brick: "What do you like, brick?"
 I once saw a Public Television program about eskimos. Among other things, it showed a master seal hunter make a kill. At just the right instant, he struck, surely and swiftly. Immediately, he exclaimed: "I almost missed!" ---Even after we have succeeded in what we set out to do, it is important to remind ourselves that we might not have, and that our best efforts were a necessary, not a sufficient condition for success.
 These are necessarily anonymous, since names name only objects, while insight is the event that makes an object be an object.
 On the subject of demythification, consider the following excerpt from an address by J. Robert Oppenheimer to a 1965 UNESCO gathering honoring Einstein on the 50th anniversary of the general theory of relativity: "I thought it might be useful, because I am sure that it is not too soon---and for our generation perhaps almost too late---to start to dispel the clouds of myth and see the great mountain peak that these clouds hide. As always, the myth has its charms; but the truth is far more beautiful." (SCIENCE, 16 May 1980, p. 698)
 I was born in the U.S.A. My parents were born in the U.S.A. Their parents "came from" Ireland and Poland. My "roots" are not in any soil, but in Broch's words, in Heidegger's words, in Laurie Anderson's words, in all and only the words that speak to me, here, now, and in the words I speak and the programs I write. Even this is a kind of tradition, but with a difference. It is the 'tradition' of reflective thinking, which honors its past by questioning it, and would make a 'habit' of making its habits justify their continued existence. It is a 'faith' that is perhaps expressed in Martin Heidegger's words: "I have abandoned an earlier position, not to espouse another, but because even that position was only a temporary rest stop on the way ('Unterwegs'). What endures in thinking is the way ('Weg')." And: "Questioning is the piety of thought." Or, for an American example, the text of William Ellery Channing's 1819 "Baltimore Sermon": "Prove all things. Hold fast that which is good."
 But see WORK IN AMERICA: Report of a Special Task Force to the Secretary of Health, Education, and Welfare, published by the MIT Press, Cambridge, 1973. This study concurs with my thinking on several key points. It also contains a bibliography of supporting sociological literature.
 What about those who cannot participate? Miners? Assembly line workers? I don't know. But, for starters: (1) If people consumed less, fewer would have to do intrinsically undesirable work. (2) Technological advance can reduce the need for people to do such work. (3) If some people, in becoming more productive, also 'have the time of their lives' doing their job, that has not hurt anybody. (4) Many jobs are undesirable because social engineers have designed the meaning out of them: maybe these jobs can be 'redeemed'.
 "Zek", i.e., a prisoner in a Soviet labor camp.
 REMINISCENCES OF LOS ALAMOS, 1943-1945, ed. Lawrence Badash, et al., published by D. Reidel Publishing Co., Hingham, Massachusetts, 1980, p. xxi. ---Los Alamos shows how personal satisfaction --- joy in life --- for the worker, and social productivity, can not only coexist, but nourish each other. When programmers are unproductive, the problem is often in their jobs, not in them. Richard Feynman describes an example from Los Alamos (REMINISCENCES, pp. 126-8): Tab machines were organized into a primitive 'computer'. Complicated calculations were sub-divided into tab-machine size steps. The input data was punched onto cards. An operator would feed the card deck into the first machine in the 'program'. He would take the processed cards from that machine and feed them into the next machine. Etc. The operators were treated as organic data buses. They were told only: take these cards from here and put them there (they did not know they were working on "the bomb"). The output was one problem per three months. Feynman was assigned to improve the productivity of this bottleneck on the project's critical path. Where a lesser person might have conducted a time-and-motion study, Feynman requested special permission to tell the operators what they were working on. (He got it.) The result: "...They were all excited: 'We're fighting a war! We see what it is!' They knew what the numbers meant... Complete transformation! They began to invent ways of doing it better... They worked at night. They didn't need supervising... They didn't need anything... And all that had to be done was to tell them what it was, that's all... We did nine problems in three months, which is nearly ten times as fast." 
Such are the gains that can come from letting people into the intersubjectivity of living discourse one is, instead of manipulating them as objects --- gains for the people, the work, and oneself (everybody 'won' here: the work got done much faster; the operators' lives were invigorated; Feynman himself got real satisfaction, and successfully completed his assignment). It is perhaps worth noting that Feynman was one of the 'wild ducks' at Los Alamos (he spent considerable time figuring out ways to circumvent site security; he compiled a covert list of all the padlock combinations by unobtrusively looking over people's shoulders, and then used this information to con the people into thinking he was a master safe-cracker...). Questions: If Feynman had told the operators the purpose of their work was to gain a competitive advantage over the XYZ Corp., or to optimize the hit-ratio on the bulk-mailing promotion, would their reaction have been the same? If that were the purpose of the work, would Feynman have done it?
 See, for example, DISCIPLINE AND PUNISH, published by Random House, New York, 1979. Surveillance of people, not just in overt social science experiments, but also in the daily routine of administrative institutions (schools, social welfare agencies, prisons, lunatic asylums, etc.) builds an ever expanding base of data for social science to analyze. Analysis of the data generates results that energize the policies of the administrative institutions. The institutions then collect more data, which, combined with the old data, forms a bigger base for further analysis (both more and 'meta', in a way reminiscent of the construction of higher infinities in set theory). Ever more of society's resources are demanded to cope with the ensuing "information explosion", until statisticians project collapse at the point where people can no longer process all the data. But, as Joseph Weizenbaum noted in COMPUTER POWER AND HUMAN REASON (W.H. Freeman and Co., San Francisco, 1976), the computer, as Superclerk, arrived in the nick of time to process the data after all (in the old way, but far faster than mortal men). Some of the largest computer systems have been implemented to enable banks, social welfare departments, and the military to carry on 'business as usual', when the volume of data exceeded what could be handled by clerks (there were alternatives, but they might have involved changes in the power relations between people). Thus, the computer has served as one of the most significant conservative social forces in our time. Weizenbaum concludes: If a 'revolution' is something that effects radical change in the structure of society, there has been no computer revolution.
 Quote is from Juris Hartmanis's remarks in the June 1981 issue of COMMUNICATIONS OF THE ACM (p.354). Hartmanis directs this disparaging image against a customary conception of computer science. I find it applicable to the social sciences in general.
 Or by a manager hiding behind 'the plan', a teacher hiding behind 'the curriculum', etc.
 There are exceptions. Their work is often dismissed as 'unscientific' by colleagues. Philip Kraft said of himself (op. cit., p. 5): "Whenever I reached some sort of conclusion, I wrote it up and showed it to several of the programmers I had talked with earlier for their comment and criticism... If people are good enough to let you bother them with questions and constant hovering around, they have a right to learn what you've learned and to know what you think of them." The kind of vituperation to which such heresy may subject a social scientist is exemplified by G. Weinberg's review in October 1978 DATAMATION (see footnote 1).
 Michel Foucault: "It is not through some advancement in the rationality of the exact sciences that the human sciences are gradually constituted [,...but rather by] the reorganization of right that invests sovereignty, and...the mechanics of the coercive forces whose exercise takes a disciplinary form." (POWER/KNOWLEDGE, Pantheon Books, New York, 1980, p. 107)
 Sophocles, ANTIGONE, translated by Dudley Fitts and Robert Fitzgerald, in: THE OEDIPUS CYCLE, published by Harcourt, Brace and World, Inc. (Harvest Books, HB8), p.218.
 Of course, even billiard balls have a kind of 'inside'. But that is just another 'outside' hiding behind the outside one sees, like the layers of an onion. A person has this kind of 'inside', too (his organs). But he has another 'inside' of an entirely different order: his 'self'. No matter how much one tears into his body, one will not find this [other] 'inside' anywhere. It is "elsewhere". Yet it is not far to seek, as, for example, if I ask you: "What do you think of this paper?" ---Consider one more time what solving a puzzle is like, and what it may tell us about the 'puzzle' of human existence: One can tear a puzzle to bits, and crush it, so that no one can ever play with it again, nor even have an opportunity to know there was a puzzle. But that will not make the puzzle yield its solution. To destroy a puzzle one need not know what puzzles are nor that the thing one destroyed was one. To have any hope of solving the puzzle, one has to respect its integrity: One has to refrain from doing certain things to it. One has to 'give it space'. Then, the solution may come (or still it may not). Orthodox social science, discipline and training (of 'children' and 'workers'), and, a fortiori, certain kinds of brain physiology experimentation, are all attempts to 'force the puzzle'. The only plausible 'success' of that will be to destroy the possibility for human existence to...exist.
 "Information about lawlike connections sets off a process of reflection in the consciousness of those whom the laws are about. Thus the level of unreflected consciousness, which is one of the initial conditions of such laws, can be transformed. Of course...a critically mediated knowledge of laws cannot through reflection alone render a law itself inoperative, but it can render it inapplicable." (Jurgen Habermas, KNOWLEDGE AND HUMAN INTERESTS, Beacon Press, Boston, 1971, p. 310)
 I use the word "friendship", instead of words like "brotherhood", "sisterhood", "the family of man", to draw a distinction and make a choice: Familial relations are (biologically) fated. Friends are chosen in freedom.
I have now (March 1984), however, more or less ceased working on the document, not because it is "finished", but because --- largely thru the process of working on it --- I have gone on to other things, and perhaps "outgrown" it. I still believe in the importance and overall soundness of what I wrote here, but my life has changed so much that present and perfect tenses have become past tenses. Among other things, in September 1983 I returned to school to study some of the issues in the relations between persons and technology which this paper was a first attempt to address, and I have not done any computer programming for over half a year [the hiatus in programming work was temporary; apart from another 18-month leave of absence beginning in summer 1993, to complete my doctoral dissertation, I have continued to have to earn my living doing computer programming of one kind or another, at least through July 1998].
I thank the persons who took their time to read the manuscript at varying stages of its (and my own) development: Daniel Fetler, Tom Gee, Fred Hennard, Mark Lindquist, Robert Malstorm, Harlan Mills, Buck Rhodes, William Rubin. Their feedback, regarding both content and style, has been of invaluable assistance. Needless to say, they often disagreed with what I wrote.