by Norman Desmarais
At a time when a sizable portion of the computer industry is working on artificial intelligence and its possibilities, I suggest we take a few moments to turn our attention to "artificial ignorance." This is particularly appropriate because, philosophically, ignorance precedes knowledge.
In this paper, we shall discuss some attitudes toward the computer and how the technology may block access to public information. We shall also examine the assumption that computers can't make mistakes and how it brings about computer dependency.
Many of us have grown up with a mystical view of the computer. We consider it smarter than humans. In a sense, it's a "black box"; we put something in and something else comes out. We key in data and it produces an analysis faster than any human can. Unfortunately, some of us may come to rely on it too much without giving it a second thought. We fail to realize that data input depends on humans. Errors in keying or programming produce incorrect results ("Garbage in = garbage out") which we often blame on the computer. We also make false assumptions that lead to false results.
Consider the case of a department head in a Big 8 accounting firm who figured his next year's budget using the most sophisticated computer and spreadsheet program on the market but misplaced a decimal point while entering his figures. Not one of the hundreds of CPAs and MBAs who read his budget spotted the blunder in time, and, as a result, the firm had to sacrifice its advertising budget for an entire year. The firm shipped the CPA to an outpost where it hoped he couldn't do any more damage.
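The anecdote above turns on a single keystroke. A small sketch (with hypothetical figures; the firm's actual spreadsheet is, of course, unknown) shows how faithfully a computer propagates such an error, and how a simple plausibility check of the kind those hundreds of reviewers never applied could have caught it:

```python
# A minimal sketch of "garbage in = garbage out", using hypothetical
# figures: the arithmetic is flawless, but a misplaced decimal point
# in one entry silently corrupts the total.

budget = {
    "salaries": 1_200_000.00,
    "travel": 85_000.00,
    "advertising": 4_000_000.00,  # the clerk meant 400,000.00 -- decimal slipped one place
}

total = sum(budget.values())
print(f"Computed total: ${total:,.2f}")  # the computer is exactly right about the wrong data

# The kind of double-check a human reviewer could apply: compare each
# line against last year's figure and flag anything off by 5x or more.
last_year = {"salaries": 1_150_000.00, "travel": 80_000.00, "advertising": 390_000.00}
for item, amount in budget.items():
    ratio = amount / last_year[item]
    if ratio > 5 or ratio < 0.2:
        print(f"CHECK: {item} is {ratio:.1f}x last year's figure")
```

The point is not the code but the asymmetry: the machine computes the wrong total to the penny, while a crude year-over-year comparison, the sort of common-sense check humans once did by habit, flags the blunder immediately.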
When we use programs, most of us accept them at face value. We fail to realize that many of them appear on the market without adequate testing. The programmer may make some assumptions that are not documented or communicated to the user who may have a different set of expectations. Or, in the course of revising or updating software, a bug in one part of the program may be corrected. Other parts which depend on the revision may get overlooked and produce errors when none would have occurred previously.
Many of us buy sophisticated programs for a variety of applications. How many of us bother to read the documentation to master the software? We often use it at a basic level and don't bother to learn its built-in short-cuts or time-saving features such as macros.
Our analytical skills and common sense may be on the wane, and too much reliance on computers may be eroding critical thinking. How many of us have seen patrons "forget" the existence of alternate sources of information (print or microform) after becoming used to computer searching? The computer has become the first and only stop in the search for information.
Computers can make us lazy. We no longer take into consideration information not available through the computer. As long as we have enough material, we're satisfied. We don't make the extra effort to search for more complete information or to verify citations. We accept "good enough" and perhaps produce shoddy results.
Nor do we read the entire document any longer. A quick perusal of an abstract may put us on the track of important and relevant articles, yet this is analogous to relying on Cliffs Notes instead of actually reading the novel or play.
An information society requires greater ability to analyze data critically--not just access to it. Useful information consists of correct data that has relevance and purpose. In addition to having the information, we need to realize its value and this requires analytical capabilities. It takes reflection and understanding to convert it into a body of knowledge.
The same technology that makes it so easy to obtain and process information could inhibit access to that information, according to the Office of Technology Assessment's report Informing the Nation: Federal Information Dissemination in an Electronic Age.
The Freedom of Information Act (FOIA, 1966) was written before the widespread use of computers to store information. Under the current FOIA, the public can get information if an agency staffer can retrieve it with "reasonable effort." This leaves fundamental access decisions to the agency. The courts may provide further interpretation if the matter appears in litigation.
Court decisions involving computer records apply analogies with paper documents. This implies a distinct boundary between what constitutes record and nonrecord material. The courts currently base the delineation of this boundary on the function of retrieval: if information requires new programming for its retrieval, there is no record. It is much easier to apply this kind of functional definition than other distinctions, but it may be inappropriate. If someone takes this decision to its extreme, one could interpret it to mean that pushing a button to print a document constitutes new programming.
In a time when government agencies store most of their data in machine-readable form, an interpretation such as the one just described could result in an information blackout. Citizens would receive little or no information because somebody has to press a few keys to retrieve it from the computer. Thus, we have a subtle shift that detaches decisions about reasonableness from any consideration of effort. This does not agree with traditional practices. Government agencies have historically spent a significant amount of time and effort searching for and producing paper documents under the Freedom of Information Act. Currently, retrieval of these paper documents may involve extensive tracking, communication with numerous bureaus, searching disparate files, and substantial hand deletion of exempted materials.
When we think that computers can't make mistakes, we assume that output is right even when input is wrong. In the days of mechanical cash registers, some owners made clerks check register stubs against purchases before bagging. This double-checking could catch errors resulting from erroneous keying or reading. Now, with computerized cash registers, we do not see any double-checking. This may come from pressure to process as many transactions as possible, as quickly as possible. Maybe managers ignore the possibility of error, or maybe they build a certain percentage of the cost into the sale price to cover losses of this type. Either way, this translates into higher costs for consumers.
Humans are fallible. They have short attention spans. Eyes misread numbers and fingers hit wrong keys. In the absence of double-checking data entry, anyone who uses a computerized cash register (and that soon will include every store clerk in America) must know how to estimate to determine whether or not to search for an error. Undetected keying errors result in overcharging customers or lost revenue to merchants. Many stores have installed optical scanning devices to eliminate such problems and to save labor costs.
Consumer associations and labor unions see disadvantages in laser scanning and bar codes. The removal of price tags from individual items makes "comparison shopping" more difficult and could lead unscrupulous store-owners to raise prices surreptitiously. Any reduction in "price consciousness" could work against the consumer and shift the balance in favor of the retailers. (The Joint Economic Committee of Congress found that U.S. supermarkets had overcharged customers by $660 million in 1974.) Many states have forced stores to reinstate price tags--which of course reduces the cost benefits of the system. In the UK, however, a special report from the Office of Fair Trading on the impact of new technology in retailing concluded, in 1982, that no new laws to protect the consumer were necessary.
As of 1990, the situation has not improved. Joseph Ferrara, assistant director of food inspection services for the state Department of Agriculture and Markets, reported that New York state inspectors checked 33 supermarkets with automatic checkout scanners in 1989. They found that 32 of them overcharged their customers. He said that more than 5 percent of the items in a typical store are mispriced in favor of the supermarket.
When the computer is down, we become unable to function. On the other hand, computer manufacturers try to develop computers that are so simple to operate that any idiot can use one. They try to make them seem easier to use than copy machines. Yet, if idiots operate the terminals, the rest of us are in big trouble.
Take Dan Gutman's case, for example. He was going to Florida on a business trip and wanted to book a hotel room. Since he wanted a particular hotel chain, he called its 800 number to locate the chain's hotel nearest to Stuart, Florida, since he knew there wasn't one in Stuart itself. The clerk responded that the information didn't show up on the computer.
Gutman then asked if they had any hotels in the state of Florida. "Yes, many of them" came the answer; but the clerk could not determine the closest one to Stuart. Had she not had the screen in front of her, she would probably have thought, "Hey, I'll pick up the map of our hotel locations in Florida and maybe there's one near Stuart." He then asked if they had any hotels in West Palm Beach, which is just 35 miles south of Stuart. "Of course" came the response. Yet the clerk had not provided the information sooner because she "didn't have that information in the computer."
Gutman concludes that, "I'm coming around to the old obsolete view that computers rot the mind. I'd like to think this woman was stupid BEFORE she used the computer, but who can tell? Maybe the computer is turning us into a nation of morons who don't know anything unless it appears on a screen in front of our faces. I'm not sure if computers are making people stupid or if they're making it possible for stupider people to get better jobs."
Yet computer literacy may not provide an adequate answer. Joseph Weizenbaum, MIT Professor of Computer Science, considers "computer literacy" to be "pure baloney." He says that people who use a computer only for its applications never need to learn how the technology works. He is also concerned about the phenomenon of computer "nerds" or junkies--kids who become addicted to their machines and have little time for fellow humans. "These unfeeling morons," he says, "come to think it's possible to reduce a human relationship to a print-out or to solve a moral question by bits and bytes."
Margaret Boden voiced a similar opinion in Artificial Intelligence and Natural Man. "One has to consider the insidiously dehumanizing effects of people's becoming decreasingly dependent upon direct human contact with their fellows for satisfaction of their various needs. In any highly mechanized society, consumption of the goods produced may become less convivial, as technology enables us to do more things without leaving the house and mixing with other people (for instance, drawing water or being entertained). Thus far, production of goods has become more socialized than in past times when the typical productive unit was the 'cottage-industry'; but many people who today can only do their jobs by going to a particular place of work, might tomorrow be able to stay at home and communicate with their clients and coworkers via the home terminal. The socially isolating influence often attributed to television is as nothing to the alienation and loneliness that might result from over-enthusiastic reliance on the home terminal and associated gadgetry."
Sometimes, computers may just make us look stupid. For users of word processors, this often comes as a result of deferring proofreading to the computer. Spelling checkers overlook typographical errors when the erroneous word is spelled correctly but results in another word, such as "pare" for "pear." They don't flag redundant words and phrases, nor do they detect omitted words that result in a loss of meaning.
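The blind spot is easy to see once you notice that a spelling checker asks only one question: is this string in the dictionary? A toy sketch (the word list here is invented, but commercial checkers of the day worked on the same principle) makes the limitation concrete:

```python
# A minimal sketch of why spelling checkers miss "real-word" typos:
# they test dictionary membership, not meaning. Toy word list.

DICTIONARY = {"we", "must", "pare", "pear", "the", "budget", "down"}

def spell_check(sentence: str) -> list[str]:
    """Return the words a naive checker would flag as misspelled."""
    return [w for w in sentence.lower().split() if w not in DICTIONARY]

print(spell_check("we must pare the budget down"))  # [] -- correct sentence passes
print(spell_check("we must pear the budget down"))  # [] -- wrong word, but it IS a word, so it passes
print(spell_check("we must paer the budget down"))  # ['paer'] -- only non-words get flagged
```

The second sentence is nonsense to a human reader and flawless to the machine; catching it requires exactly the kind of judgment we are tempted to defer to the computer.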
As we move into the information age, we will depend even more on computers to monitor complex operations like running nuclear power plants, flying aircraft, and moving money around. We need to be assured that the computers, their operating systems, and their software operate correctly, and that not just any idiot is at the controls.
"Information monopolies have always been a source of power ... It takes years to become skilled in these fields, but expert systems may make the information available to ordinary mortals who spend far less time studying than experts have had to. The democratization of information may change our whole concept of professional services, or it may once again prove that a little knowledge is a dangerous thing."