The Effects of Technology on Business: Technologically Induced Numbness
A paper delivered at the 3rd International Conference Promoting Business Ethics, Niagara Falls, November 1, 1996.



We need technology, and yet every new technology places new demands upon us and creates new forms of stress. We can't live with it, but we can't live without it. There is no turning back to some pre-technological Eden. Aristotle rightly described man as an animal that lives by technology. The human race lives by art and reasoning (techne kai logismois). 1 Other animals come as complete packages. Their sense powers and instinctive programming are infallible within the limits of their particular ecological niche. Their organs (their hardware) and their instincts (software) are tailored to specific activities. As Aristotle said, they sleep with their shoes on.

The other animals each have only one method of defence and cannot exchange it for another. The human is physically weaker, but by the use of his hands he can create any tool that any other animal has. 2 Our weakness is our strength, as the lack of any specialized defensive organs makes us free to be versatile. Likewise, we are poorly equipped with instincts.

When we are born, we are able to breathe and swim by virtue of an interior program, call it an instinct. At a certain point the breathing instinct starts to fade and the newborn child must make a conscious effort to breathe. If he fails to make the transition, so it is thought, the result is Sudden Infant Death Syndrome. Our entire behavioral repertoire is made up of what Aristotle would call "art" or "techne". Interestingly enough, the child learns with greater ease how to breathe if he sleeps in proximity to his parents, as has been surmised from the low incidence of Sudden Infant Death Syndrome in cultures where that is the norm. Thus, even breathing is a piece of know-how passed on culturally.

In a broad sense, then, technology forms our environment. This environment remains unperceived unless we are separated from it, as a fish does not know what water is until it is beached. The particular technological environment wherein we are nurtured is incorporated into our being. It forms who we are. We do not need to make any special effort to learn it. Rather, it is learned by absorption. Marshall McLuhan noted the ease with which we learn when learning is put in an environmental context. We learn our native language environmentally and effortlessly. McLuhan wrote in 1968:

Archibald McKinnon, Dean of Education at the University of British Columbia, has been carrying out experiments in recent years to illustrate this principle. He has been borrowing natives from the deepest jungles of the world and exposing them to sophisticated situations without any training in our Western knowledge. For example, he allows them to explore four-engine jet airplanes, and within three or four months they not only fly them but can assemble and repair any part of the mechanism of these planes. 3

We probably all have personal experience relating to how people relate to computer technology. Some people, children more than adults, jump right in to using the machine. Others view the machine with apprehension, hitting the panic button every time they cannot make it perform. Those who adapt to it treat it with the same familiarity as one treats a pet animal, and they learn quickly by trying different things. The apprehensive despair of learning the theoretical intricacies of files and directories. The computer, then, is for the first group part of the environment. For the second group, it is an external irritant in the environment to which they have grown accustomed.

Some may differ, but I think that the telephone is every bit as complex to use as a computer or the Internet, yet we do not have community colleges offering courses on the use of the telephone to the apprehensive. Neither do we have prominent politicians pushing for a phone in every classroom, as they presently push for Internet terminals in every classroom. The reasoning would be the same: after all, since you can reach anyone in the world with a phone, wouldn't it enhance the learning process to let students spend their time calling people and learning? This is the approach some educators take to the Internet.

The phone is environmental, and we have more or less learned to put it in its place. We use phones as naturally as a dog wags its tail, and we miss the phone in its absence as if it were part of our body. Possibly, when the Internet becomes environmental, the childish enthusiasm will subside. Another consideration is that Internet technology will change constantly, especially as to bandwidth, the amount of information that can be carried. To install hardware on a massive scale in classrooms, as some politicians have proposed, would be to burden the system with equipment that would probably be outdated even while it was being put in place.

Aristotle pointed out that every organ (organon is simply the Greek word for instrument or tool) can be understood only in relation to its utility to the organism. Every technology is likewise an extension of our own natural powers, and can be traced back to some natural need. This is to say that technology can be understood in terms of final cause, or purpose, and that purpose is a purpose of the living human being. Every instrument or tool has objective effects, the work done, in terms of results both intended and unintended. The objective effects of technology are easy to study by conventional methods. Every invention comes from the desire to enhance an existing natural function, to accelerate it. This implies that the objective effects of technology can be studied in a quantitative manner. The objective benefits of one car over another can be measured by speed and fuel consumption. The deaths that result from the use of cars, compared with the time saved by the use of cars, and the pollution produced by various types of fuel are easily tabulated.

The subjective effects of technology are more difficult to study by conventional methods. The subjective effects are the many ways in which the user of a technology is shaped by its use. These effects cannot be treated simply in a quantitative manner. The assimilation of a new technology and technological environment creates a new type of human existence, and that is qualitative. Statistical studies based on what can be counted and measured, the accepted approach to the impact of technology, are of limited use. A statistical study is a priori limited in that a decision must be made to measure a certain factor, which means that the investigator must fix in stone what he is going to find before he finds it, and to reduce the complexity of change to manageable quantities. Gilbert Keith Chesterton pointed out with gentle sarcasm the self-imposed blindness of the scholar in the face of real change:

They were not merely mentioning the things they remembered, but remembering the things they were supposed to mention. Their minds had recorded only the things that were suited to the records and writing only the records that were suited to the official record office. 4


To be attuned to the effects of technology, we cannot suppose that when some activity is accelerated by technology, the result will be simply that the same things happen in the same way as before, only faster. In nature, there are no linear processes that go on to infinity. If you continually heat water, you do not merely get continually hotter water; at a certain point it boils. If a tap is dripping with regularity, increased flow does not merely make it drip faster with the same regularity: the rhythm becomes nonperiodic, and then turns into a steady stream. Likewise, when the flow of information is increased, you do not merely have the same people doing the same things with added speed; there are sudden qualitative changes in the way people live and perceive. Following past trends is not helpful, because the changes effected will be in areas that were not accepted as worthy of study. Chesterton writes of those who trust in official records: "They had not the disinterestedness or detachment of gossip." He continues:

Often they never hear the songs, because they are sung in public houses. Often they never hear of the arts and crafts, because they are not recorded in books. This sort of ignorance is indeed ignorance, but it is ignorance in arms, ignorance militant and triumphant, ignorance advancing with all its armies across a conquered land. 5
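The nonlinear transitions described above, the kettle that suddenly boils and the tap whose rhythm breaks, have a standard toy model in the mathematics of chaos: the logistic map. The sketch below is illustrative only and is not part of the original lecture; it shows how a smooth increase in a single parameter produces abrupt qualitative changes of regime rather than "more of the same".

```python
# The logistic map x -> r*x*(1-x): raising the parameter r smoothly
# moves the system from a steady state, to oscillation, to chaos.

def long_run_values(r, n_settle=1000, n_keep=8):
    """Iterate the map past its transient, then collect the distinct
    values it settles into (rounded, so a true cycle shows up as a
    small set)."""
    x = 0.5
    for _ in range(n_settle):
        x = r * x * (1 - x)
    seen = []
    for _ in range(n_keep):
        x = r * x * (1 - x)
        seen.append(round(x, 6))
    return sorted(set(seen))

print(long_run_values(2.8))  # one value: a steady drip
print(long_run_values(3.2))  # two values: the rhythm has doubled
print(long_run_values(3.9))  # many values: turbulence
```

The point of the sketch is that nothing in the formula changes between the three calls except the single number r, yet the behavior changes in kind, not merely in degree.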


The point is that the specialist will always be outflanked by qualitative change. Every specialism is a creature of its particular cultural and technological environment. Specialized knowledge and practice is our human substitute for animal instinct. It works infallibly, as an instinct does, so long as it is in the context of its ecological niche. When that particular niche is destroyed, the specialist finds himself obsolete, just as a bird's navigational instincts fail when confronted with an image of the sun reflected in glass. Perhaps a more telling example, and I hypothesize here, is that of the turtles that swim vast oceans to lay their eggs. Perhaps at one time they were swimming a few miles, but with continental drift every year the breeding grounds and the feeding grounds were farther apart, as if a commuter from Buffalo to Niagara University found himself eventually travelling four days to get there. The mathematics graduate can adapt more easily to state of the art technology than the computer science graduate, since the technology taught in schools will always be several steps behind state of the art.

Technological change and the cultural change that is its companion force everyone to be both a philosopher and an artist. The attitude of the philosopher is that of taking nothing for granted, and that attitude arises only when some new perception jars the conceptual framework that no longer applies. The artist takes the environment we take for granted and puts it into relief, where we can see it for what it is. Many will be familiar with what Aristotle had to say about the value of general knowledge over specialized knowledge. The generalist may not be able to give a precise answer under every circumstance, but he can come to a correct general judgement under all circumstances because he has a grasp of general principles.6


In times of stability, there is not a great need to distance ourselves from the immediate demands that we place on technology and that technology places on us. Since, however, technology changes the way that we relate to the world, we do need to have a wider view in order that the processes of change do not master us.

The starting point in ethics is that we should be in effective contact with reality through knowing the truth. To put this in other terms, ethics is not an issue in a virtual reality or a dream. Ethics comes into play when we are deliberately acting in the context of the real world. Karol Wojtyła wrote in 1958:

Underlying the whole system of norms based on nature and formulated by the reason (we know that these norms are at the same time contained in Revelation) one may present the principle: in all your activity remain in harmony with objective reality. This reality is made up, on the one hand, of the acting subject endowed with a rational nature, and on the other hand, it is made up of the whole series of object-beings which this subject encounters in his activity, each one of which has its own nature. This fundamental principle, the principle that one should remain in harmony or agreement with reality, both objective and subjective reality, in one's activity, is the gauge of realism in the whole of practical philosophy, and in particular in ethics. Ethical norms are based on reality. The same faculty of reason, which in its knowing attains to reality itself, also defines the principles of activity. It is therefore no wonder that the reason introduces as a fundamental axiom into its normative function the axiom of harmony or agreement with this reality. This is precisely the axiom of realism in ethics. 7


The ethical primacy of retaining a true contact with reality and harmony with reality is directly involved in the question of technology. First, the technological culture itself is a reality that we must be aware of. We must perceive what is real and actual, and what is obsolete and a mere appearance, and recognize the obstacles that block this perception. Here it is important to take stock of biases that we have inherited from a previous stage of technological culture.

Second, our own technological culture will heavily bias our perceptions of the world. The user of technology finds himself focussed in a certain way by the tools that he uses. The technology will impose a certain set of priorities. If it focusses us on one thing, it causes inattention in other areas. Any technology creates a new set of mental habits. This is especially true of computers, since they do not merely extend our limbs, but extend our cognitive and perceptive faculties. The entire ensemble of information technology changes our perceptual habits not merely per accidens, but by design.

The introduction of information technology increases the speed and volume of the information that we must deal with. The result is real pain. Aristotle noted that what we feel to be painful may largely be a matter of habituation. An animal organism is normally under stress. Sight and hearing are in fact painful, but we have become used to them over the course of time. Once accustomed to a certain level of sensory input, we feel the withdrawal of sensation to be painful or distressing. 8
The biologist Otto Lowenstein reports that when people blind from childhood recover their sight, the flood of added stimulation is unpleasant. The first reaction is to shrink back from it all. Initially, the newly sighted cannot discern distinct objects in the chaotic jumble of sensation. When they begin to discern objects and associate them with the objects they know from their other senses, the mental pain subsides:

"At first sight" the world looks like a flat extension of meaningless patches of light, dark, and color jumbled into a quilt work. One by one objects grow out of this chaotic world, and remain unmistakably separate once they have been identified. A meaningless jumble of shapes defies description, until the demonstrator has drawn on paper one or the other specific shapes to be searched for. The saying "seeing is believing" may fittingly be reversed in this context into "believing is seeing". 9


At a lower level, it is like Viktor Frankl's basic insight in logotherapy: it is meaning that makes life bearable. 10
The stress of increased sensation, however, is at a biological level, and the "search for meaning" in the flood of information is a perfectly understandable mechanism for recovering internal equilibrium. The process of recognizing key patterns, such as the blind man experiences when he first learns to pick out visual objects, involves a cultivated insensitivity to all else that is irrelevant.

The process is familiar to anyone who has learned how to drive properly. The most important skill taught in driving school is not knowledge of laws and mechanisms but an art of controlled eye movement. The driver is faced with a faster flow of visual data than is the pedestrian, and this alone places special demands on his abilities. Moreover, the consequences of mishaps are greater at higher speeds. The pedestrian may take in the sights, but the driver must check his mirrors, the status of intersections, the spaces underneath parked cars. He must be attuned to the velocities of other vehicles and the intentions of other drivers. If you can remember your earliest driving experiences in heavy traffic, you may know what it is like to be a blind man recovering his sight. When we are immersed in an increased flow of information, we must control the flow in order to remain sane.

Although Aquinas did not treat the question of technology directly, he does offer some insights that can be extended to the phenomenon of technologically induced numbness. The mind becomes what it knows, but as we each have but one mind, at any moment the mind can be informed by only one perceived object. If we concentrate on the parts of a thing, then the whole recedes from our attention. If we concentrate on the whole, the parts are present to our mind only in a confused way. 11
In Thomas' museum of virtues and vices, we find caecitas mentis (blindness of the mind) and hebetudo sensus (dullness of the intellectual sense). Both vices are caused by sensual imbalance: in each case, a disproportionate pleasure draws one's attention to the pleasurable object, and this debilitates the intellect. 12
The penetrating power of the intellect, what we call insight, is compromised by overpowering pleasures, and it must be protected by asceticism. Plenus venter non studet libenter: a full stomach does not incline us to study. Thomas speaks of the effect of the pleasures of food and sex, but it is easy to extend his reasoning to any overpowering stimulus. Physical pain and nonperiodic loud noises also impede our concentration.

We can apply the same to the engagement of attention required by any artifact that speeds up any human activity. Pleasure may impede the use of reason, when it distracts us from what the reason says we should seek. 13
An excess of pleasure can destroy prudential judgement, even if it leaves our ability to form theoretical judgements intact. On the other hand, delectatio, or delight and pleasure, can improve our performance. 14
Even a dog appears to derive pleasure from performing a task well. The pleasure that one gets from doing something well improves attention, as reasoning and reflex work together.

Since what we find pleasurable is largely a matter of habituation, all this can be applied to our practical relationship to our instruments. The inattention to vast areas of experience induced by our attention to a demanding technical task is not in itself bad. It is good and necessary. Yet if the mental habits are carried over into other areas where they are not appropriate, our habituation and pleasure work against reason. For example, the scholar has the Cartesian habit of being very cautious in coming to a conclusion. He must resist the attraction of rhetoric and verisimilitude. This habit is the basis of scientific rigor. No matter how many experiments or studies have suggested a conclusion, the next experiment could falsify it. If he carries that habit into daily life, he will be incapable of effective action, not to mention normal polite conversation.

It once occurred to me that driving could be made more pleasant if every driver was equipped with a radio that allowed him to converse with all the other drivers within visual range. Upon reflection, that would be dangerous. Communication between drivers is limited to a small number of light signals, and that is as it should be. These signals form the vocabulary of the driver. A greater vocabulary would create an information traffic jam. The driver is already putting stress on his natural ability to process information. As in other areas of life, the driver develops a selective numbness with regard to information. McLuhan writes:

The principle of numbness comes into play with electric technology, as with any other. We have to numb our central nervous system when it is extended and exposed, or we will die. Thus the age of anxiety and of electric media is also the age of the unconscious and of apathy.15


The computer is capable of handling precisely quantified information at a much higher speed than the unaided human. The amount of information that can be compiled and processed by a computer is quite beyond the ability of any human to digest. For this reason, the most important feature in software design is the interface, which acts as the first buffer between the user and the data. The program takes the user step by step along a certain path into his data, and is itself bound up with how the user will interpret that data. It guides him through an ocean of information and does the work of filtering that information and imposing pattern upon it.

Herein lies the weakness or danger in the use of computers, and this danger has an ethical dimension. The blind man who recovers his sight must do his own work when he learns to perceive limited patterns in an ocean of sensation. The computer user runs the risk of leaving to others the task of filtering reality. Important cognitive decisions are already made for him by the flow of the program. The computer excels at linear quantitative projections, but the task of perceiving qualitative change is not like using a computer program. It is more like building a new operating system from ground up.

Like the information collectors that Chesterton mentions, management by computer tends to limit our attention to selected information. There is the story of the mathematician and physicist Mitchell Feigenbaum, who owed a major discovery to the fact that he did not have a computer. He was doing calculations to model the events in various phase transitions, such as the transition from regular flow to turbulence, or the transition from liquid to gas. Although he worked at Los Alamos, he did not have the use of a computer and had to do repetitive calculations on a Hewlett-Packard HP-65 programmable calculator. Ian Stewart, in his book Does God Play Dice?, writes:

This turned out to be a stroke of luck, because the calculator was so slow that its operator had time to think about the results as they emerged. Indeed, before they emerged: the calculation began with an approximation of the required number and then improved it step by step. Now, the better the initial approximation, the less time the calculation took. So to save time (an important consideration when you're using a calculator) Feigenbaum started trying to guess what the next number in the cascade might be. Soon he saw a pattern. The differences between successive numbers were in a constant ratio, each about four times as big as the next one.16


Mitchell Feigenbaum had made the first step in discovering a mathematical constant as important as pi. It is now called the Feigenbaum constant, and its value is approximately 4.6692016090. This constant is of key importance in describing how pockets of order and regularity can arise out of processes that otherwise seem to tend to chaos. Perhaps corresponding examples could be found in any institution where people have come to rely on a computer system.
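Feigenbaum's observation can be sketched numerically. The r-values below are approximate period-doubling points of the logistic map, taken as given from standard published tables rather than computed here; the point of the sketch is only that the ratios of successive gaps between them settle toward his constant, roughly 4.669.

```python
# Approximate parameter values at which the logistic map's period
# doubles (1->2, 2->4, 4->8, ...), from standard tables.
bifurcations = [3.0, 3.449490, 3.544090, 3.564407, 3.568759]

# Feigenbaum's insight: each gap is about 4.669 times the next one.
for i in range(len(bifurcations) - 2):
    gap1 = bifurcations[i + 1] - bifurcations[i]
    gap2 = bifurcations[i + 2] - bifurcations[i + 1]
    print(f"ratio {i + 1}: {gap1 / gap2:.4f}")
```

This is exactly the kind of pattern a fast computer would have printed past too quickly to notice, and a slow calculator forced him to see.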

By speeding up the flow of information, computers increase the possibility of overlooking errors, and therefore make it more difficult to correct errors once they enter the information flow. As illustrated by the case of Mitchell Feigenbaum, the speedup of information also makes it easy to overlook important truths, because the processing of information already has criteria by which it dismisses some facts as irrelevant. The effect of errors at any stage is also multiplied. The first Pentium chips had a flaw which affected certain calculations around the ninth decimal place, yet some businesses reported that the figures produced by their old 486s disagreed with those of their new Pentiums by millions of dollars.
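How a tiny numerical flaw becomes millions of dollars is easy to see in a hypothetical illustration. All the numbers below are invented, and the sketch does not describe the actual Pentium defect; it only shows the general mechanism of a small per-operation error compounding across a long chain of dependent calculations.

```python
# Hypothetical figures: a relative error of one part in a billion per
# operation, compounded over a million chained calculations, applied
# to a billion-dollar book of business.
per_op_error = 1e-9
operations = 1_000_000
principal = 1_000_000_000.0

# Each calculation feeds the next, so the small errors multiply.
drift = principal * ((1 + per_op_error) ** operations - 1)
print(f"accumulated drift: ${drift:,.2f}")
```

An error invisible in any single calculation, nine places down, becomes material once the machine is fast enough to chain millions of calculations before anyone looks.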

The response to this sort of problem is often to introduce redundant data input. Then we have the paradoxical situation of computers producing more and more work, and more and more paper, instead of delivering the promised reduction of work. Perhaps it belongs only to the realm of gossip and songs in public houses that Chesterton mentioned, but time and time again I have heard it said that record-keeping was more serviceable before computers were involved. If Feigenbaum were not already famous, he would deserve to be merely for what we might call the Feigenbaum syndrome: all the data that goes nowhere because it is so quickly processed.

The effects of the computer on our cognitive habits have an analogy in the older technology of writing. Just as computers evoke awe in those who do not use them, so at one time the ability to use letters evoked awe. The word "glamour" is from "grammar", which is from the Greek gramma, or letter. Does this mean that a glamorous model is one whose prepositions are not dangling?

Socrates related a myth of how an Egyptian god named Theuth invented the art of writing. He was quite pleased with himself, and thought that he had invented an elixir for memory. He presented his invention to the god Ammon, king of Thebes. Ammon disagreed. Theuth was too enamoured of his own invention to perceive its real effect. Where Theuth claimed that letters would aid memory, Ammon said that letters do not strengthen memory, but are merely a tool for reminding. The use of letters would produce forgetfulness in those who use them. Because they could commit their words to something external to themselves, they would not practice the use of memory. Writing, said Ammon, would also produce the appearance of wisdom without the substance, since a word on paper cannot be interrogated but appears to be something fixed and stable. The written word would be bandied about by those who did not understand it.17


Aristotle taught that the written word is a sign of the spoken word, and the spoken word is a sign of what is in the mind or soul.18
The spoken word is dependent upon the perceptions that it signifies, otherwise it is mere noise. The written word likewise depends on the spoken word. To put it into the language of Marshall McLuhan, the spoken word is an extension of the word that is in the mind, and the written word is an extension of the spoken word. The relation is like that of a vine and its branches. If the extension is cut off from its source, it is dead.

This has application in all areas of life. The media of information tend to separate the processes of direct perception and the gathering of information from the processing and interpreting of that data. Those who check inventory in a business, or who evaluate students in an educational institution, often deal with factors that cannot be reduced to numbers. The person checking inventory may decide, for example, that something should not be counted because it might be damaged, or awaiting recall, but the forms that he fills out cannot possibly be so large as to let him record all such facts. The people interpreting the data are separated from these perceptions.

We expect people to be capable of writing when they have little experience in conversation. From infancy, children are continually deprived of the rich auditory environment of former times, where they would be surrounded by conversation and myth. The school system, in its present form the product of a 19th century mentality that extended the techniques of large scale production to the raising of children, is based on the idea that the raising of children should be taken out of the hands of amateurs and run more efficiently by large scale programs and specialists. Students are formed passively in habits suited to obsolescent factories, rewarded for passive punctuality. Their habits of perception are also constrained. Like the record-keepers that Chesterton mentioned, they are trained to ignore their own perceptions and the spoken word and to repeat only what can be validated by reference to a written work.

The authority of the written work, however, rests largely on the economics of producing a printed work. To this day, a considerable outlay is required to produce a book by offset presses. In some cases, it is only worthwhile if the book will sell 10,000 copies or more. The book, then, acquires prestige because the economics forces publishers to be selective. If we assume that publishers are men of good judgement, only the best works are selected for publication out of economic necessity. This lends the book prestige. The book is authoritative, the spoken word is not, unless someone has recorded that word in print. I have found academics to be suspicious of the authority of citations based on sources in the World Wide Web and other Internet media. There, at virtually no cost, anyone can publish anything and make it accessible to millions of people in less time than it takes to go to the public library.
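The economics behind the prestige of print can be put in simple break-even terms. The cost figures below are invented for illustration (only the 10,000-copy threshold comes from the text): fixed setup costs of offset printing must be spread across the print run, which is precisely what forces publishers to be selective.

```python
# Hypothetical break-even sketch for an offset print run.
# All figures are invented; they are chosen to land on the text's
# 10,000-copy threshold.
fixed_cost = 20_000.0    # plates, typesetting, press setup
unit_cost = 2.0          # paper, ink, binding per copy
revenue_per_copy = 4.0   # what the publisher clears per copy sold

# Copies that must sell before the fixed outlay is recovered.
break_even = fixed_cost / (revenue_per_copy - unit_cost)
print(int(break_even))
```

On the Web, the fixed cost in this formula collapses toward zero, the break-even point collapses with it, and with it goes the selectivity that made the printed book authoritative.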

This is an example of how the quantitative speed-up of information results in several qualitative changes. The authority of the published work, and of the published author, is a result of obsolete technology. The possession of academic credentials has no weight in the new media. Also, the compartmentalization of society, where education, entertainment and business exist as separate specialties, does not stand under the new conditions. The purveyor of advertising on the Internet must provide at least the promise of learning. The provider of scientific or historical information must think of the visual impact of his work, and so becomes an entertainer. An institution that commits itself to the new technology may be forced to redefine itself.

Of course, the electronic media will not make print disappear, any more than the print media made the spoken word disappear. The electronic media do recover something that was diminished by the printed page. Scholasticism was the communal approach to academic questions, where disputed questions were handled in a public manner, as men would haggle for a pig in the marketplace, although the scholastic method involved many levels of formality. With the age of print, scholasticism ended. The author would work alone, writing as if holding a private conversation with himself. The private point of view and individualism entered the world of ideas. Also, the economics of print over manuscript culture produced the copyright. To this day, the university lecture is an extension of the printed page, whereas in the times of scholasticism it was the reverse.

With the Internet, with the use of e-mail and Internet Relay Chat (IRC), the conversational mode of scholasticism is resurrected. Many are anxious that writers cannot effectively protect their copyrights, and it is a well-founded anxiety. A conversation where the parties are worried about whether their words will be stolen can only grind to a halt. By the internal necessity of the medium, the copyright mentality will become extinct on the Internet. Those who insist on the right to charge people for conversation will simply be ignored.

In other respects, the use of information technology can lead to a debasement of language. This has an analogue in the transition from oral tradition to written records. The spoken word is far richer in information than the written word. The simple proof of this is that a computer file containing three spoken words, as they are spoken, is several orders of magnitude larger than an ASCII file containing the same words in the form of letters. The spoken word has nuances of intonation that must always be abstracted away in the written word, and for this reason the spoken word will always be more effective than the written word. What seems to be one written word does not correspond to some perfect unity called a spoken word.
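The file-size claim can be checked with rough arithmetic. The parameters below are assumptions (telephone-quality digital audio and about two seconds to speak three words), not figures from the text:

```python
# Back-of-envelope comparison: recorded speech vs. the same words as ASCII.
sample_rate = 8000       # samples per second, telephone quality (assumed)
bytes_per_sample = 2     # 16-bit audio (assumed)
duration = 2.0           # seconds to speak three words (assumed)

audio_bytes = int(sample_rate * bytes_per_sample * duration)
text_bytes = len("the spoken word")   # the same three words in ASCII

print(audio_bytes, text_bytes, audio_bytes // text_bytes)
```

Even at low fidelity the recording is thousands of times larger than the text, which is one crude measure of how much the written word abstracts away.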

The spoken word remains the privileged medium of our most important communications. The evidence of this is the fact that people go to the trouble and expense of attending conferences at all, when it would be easier and cheaper to send written information either by post or e-mail. The written word is a much more compact vehicle for information than the spoken word, but this comes at a cost. The computer, with the automation of logic, holds the promise not only of directing communications, but also of translating them into a universal medium or language. Lest this seem far-fetched, let us consider that it is already being done.

The Automatic Teller Machine or the point-of-sale terminal may provide an interface in any human language, but the information it processes is translated into an international code. The same process could be applied to any human communication. The principle that makes it effective is that the computer system places limits on what we say, offering only a fixed set of alternatives. This sidesteps the problem of translating ambiguous grammatical constructions and phrases. In the simplest form, a computer would present you with a choice of five frequent greetings. Your interlocutor, in his own language, would be presented with five possible responses to your greeting, and so on. The approach of the ATM could thus be extended to any human communication, by limiting grammar and content through questions. It is obvious that we would lose a great deal, but given other choices that we have made as a society, we are likely to accept the drawbacks.
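The menu-driven scheme just described can be sketched in a few lines. The phrase tables and language codes below are invented for illustration; the point is that only an index crosses between the parties, so no grammatical analysis is ever required:

```python
# A minimal sketch of menu-driven "translation": each party picks
# from a fixed list of utterances, and only the chosen index is
# transmitted. The phrase tables below are invented examples.

GREETINGS = {
    "en": ["Hello", "Good morning", "Good evening", "How are you?", "Goodbye"],
    "fr": ["Bonjour", "Bonjour (matin)", "Bonsoir", "Comment allez-vous ?", "Au revoir"],
}

def send(choice: int) -> int:
    """The sender picks a numbered phrase; only the number is transmitted."""
    return choice

def receive(choice: int, language: str) -> str:
    """The receiver sees the same phrase rendered in his own language."""
    return GREETINGS[language][choice]

# An English speaker chooses option 3 ("How are you?");
# a French speaker reads it as "Comment allez-vous ?".
index = send(3)
print(receive(index, "fr"))
```

Nothing ambiguous is ever translated, because nothing ambiguous can be said: the system constrains expression to what its tables already contain, which is exactly the trade-off described above.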

This is like the fast-food industry, whose main selling point is convenience. The convenience consists not simply in offering the consumer faster service, but in training him to act in a certain way and limiting his options.

We are somewhat familiar with this approach from grammar-checking software. Grammar, however, is infinitely flexible. The so-called rules of grammar are a bookish reflection on something that already exists. What it is to say the right word in the right place cannot be defined by any set of rules smaller than the total collection of everything that is said by everyone. Any acceleration in information flow means that the complexity of language must somehow be rendered manageable. For example, the language of the Mandarins was very rich in vocabulary, and it was sustained by a class that had much leisure for study. With industrialization and mass culture in China, much of the vocabulary has been lost, perhaps permanently. I may refer again to the way in which automobile drivers communicate. A vocabulary of twenty thousand different signals would be unusable. Rather, drivers rely on a code of five or six signals to convey their intentions. The critics of modern culture often take a very reactionary stance toward the impoverishment of language. Take the comments of Noam Chomsky:

Language is, after all, a tool for thought. If you debase the language, you debase the thought. I don't want to exaggerate the element of it, but it is one element, and one that's certainly consciously manipulated in order to introduce confusion and lack of perception. 19


Chomsky, like Solzhenitsyn, perceives the fact that in mass culture language is simplified. Some of this may be a deliberate part of disinformation, as in political rhetoric. I think, however, that they miss an important point. A mass culture is a product of its means of communication and of technology in general. The speed at which people live demands that communications be simplified. For example, if you go to McDonald's you are presented with a limited range of choices. You cannot request something with a little more mustard, or perhaps that the meat be a little on the rare side. The vocabulary with which you interact with the restaurant is simplified, and nobody questions this. To think it could be otherwise would be like supposing that we should get rid of money as a common and simple measure of exchange, and institute one kind of coin for bread, another for beef, and so forth. Then the "language of money" would be enriched, but it would run counter to the whole purpose of money.

I accept it as a fact that the use of language must be adapted to the conditions imposed by our technology. This happens whether we agree to it or not. Yet Chomsky is correct when he says that the debasement of language results in a debasement of thought. For the first time, we are actually faced with the possibility that language itself may become a corporate possession. Those who design the systems we increasingly interact with have the power to determine what tools we have for thinking. It happened with the development of literacy, and accelerated with the development of printing. Rules of correct spelling followed the development of the printing press, which placed the power of social communication with a certain class of people who made the rules. What we are faced with now is the rules of so-called expert systems, which will limit our modes of expression along the lines drawn by programmers. Computers may overextend our capacity to communicate to the point that the skills of spoken language almost disappear.

I may draw on an example familiar to many people. A general practitioner, after years of experience, is capable of merely looking at a patient and arriving at a prudent judgement as to the illness and the cure. Yet any physician today who relied on his immediate judgement would run the risk of a lawsuit. He must second-guess himself, submit the patient to a number of tests, and refer the patient to a specialist. All this takes time and costs money. As often as not, when the patient finally sees the specialist, he will get the same treatment that the general practitioner would have prescribed in the first place, although by then it may be too late. This is a typical case where our reliance on technology corresponds to a distrust of our own natural powers of judgement. The judgement of the general practitioner may be further compromised when medical expert systems become commonplace. Any expert system can only be a truncated version of some real individual's accumulated knowledge, so this prospect cannot be viewed with optimism.

Information technology of any sort is a valuable extension of our natural powers of perception and reasoning, but when we rely on it exclusively, it has a debilitating effect. In the first stage, when we are confronted with a new technology, it absorbs all our resources of adaptation and all our attention, and we tend to push it to its limits. If this remains the permanent attitude, the result is that we overextend the natural powers which the technology was meant to serve, and become the servants of the technology.



NOTES

1. Aristotle, Metaphysics, I, i, 980b 25-30.

2. Aristotle, Parts of Animals, IV, x, 687a.

3. Marshall McLuhan and Quentin Fiore, War and Peace in the Global Village, McGraw Hill, NY and Toronto, 1968, p. 149.

4. G.K. Chesterton, Illustrated London News, Feb. 14, 1931, in More Quotable Chesterton, ed. George J. Marlin, Richard P. Rabatin, and John L. Swan, Ignatius Press, 1988, p. 431.

5. Illustrated London News, Nov. 6, 1920, p. 431.

6. Aristotle, Nicomachean Ethics, I, iii, 1095a-b; Parts of Animals, I, 639a 1.

7. Fr. Karol Wojtyla, Elementarz Etyczny (An Ethics Primer), Znak, Krakow, 1979, a collection of articles that appeared in Tygodnik Powszechny (Catholic Weekly) in 1957-58. My translation.

8. Nicomachean Ethics, VII, 1154b 6.

9. Otto Lowenstein, The Senses, quoted in McLuhan, War and Peace in the Global Village, pp. 10-11.

10. Cf. Viktor E. Frankl, Man's Search for Meaning: An Introduction to Logotherapy, Washington Square Press, NY, 1963.

11. Aquinas, Summa Theologica, Ia, q. 85, a. 4, Utrum possimus multa simul intelligere, ad 3.

12. Summa Theologica, II-II, q. 15, a. 3, Utrum caecitas mentis et hebetudo sensus oriantur ex peccatis carnalibus.

13. Summa Theologica, I-II, q. 33, a. 3, Utrum delectatio impediat usum rationis.

14. Summa Theologica, I-II, q. 33, a. 4, Utrum delectatio perficiat operationem.

15. Marshall McLuhan, Understanding Media, McGraw Hill, NY, 1964, p. 56.

16. Ian Stewart, Does God Play Dice? The Mathematics of Chaos, Basil Blackwell, Oxford, 1989, p. 197.

17. Plato, Phaedrus, 274-275.

18. Aristotle, Peri Hermeneias, I, 16a.

19. Noam Chomsky, Chronicles of Dissent, interviews with David Barsamian, Common Courage Press, Maine, 1992, p. 43.