Proceedings of the 35th Hawaii International Conference on System Sciences - 2002

Information Technology, Responsibility, and Anthropology

Bernd Carsten Stahl
University College Dublin, Department of Management Information Systems
[email protected], www.bc-stahl.de

Abstract

Information technology is one potential object of responsibility, as we know from texts on computer ethics, information ethics, and related fields. This article aims to demonstrate that the ethical problems of computers and information technology frequently result from anthropological assumptions. The interaction of humans and computers is more than just the use of a value-neutral tool. The use of information technology changes the way we perceive ourselves and the way we can discharge our responsibilities. There are positive and negative sides to this anthropological question of responsibility for IT. Computers increase our reach and multiply the amount of information at our disposal. We can know more about more subjects and communicate with more people than ever before. On the other hand, communication using information technology is systematically distorted. If we neglect to take this into account, responsibility for and through IT can lose its legitimacy.

1. Introduction

The purpose of this paper is to call attention to an area of information technology that is usually overlooked. My thesis is that the use (especially the commercial use) of IT in general, and of computers and information systems in particular, has serious ethical implications, some of which are caused by anthropological views and suppositions. Neglecting to take this into account, so my thesis continues, endangers the legitimacy and thus the acceptance of the use of information systems. IT in the wide sense of the term, ranging from telephones through computers to the Internet, is a crucial success factor in today's economy. In times of globalisation, e-commerce, and an ever-increasing number of new business models, it appears to be impossible for most businesses to survive or prosper without IT. The reality of business today is that most steps of production or service provision are aided, organized, or even carried out by computers and information technology. The topic of this paper is therefore situated somewhere between business ethics and computer and information ethics.

In order to prove my point that IT is of relevance to ethics, I will concentrate on the notion of responsibility as a central model for moral philosophy. Responsibility is certainly not the only possible ethical model for this purpose, but it is a good starting point because it is frequently used in business. At the same time, responsibility is especially suitable for demonstrating some of the ethical shortcomings of information technology. Several of these ethical problems are based on anthropological assumptions in information technology, and some appear to be intrinsically linked to it. In the end I hope to prove that there are some ethical problems inherent in information technology which every theory of professional ethics has to take into account. To get there, however, we will first have to define the notions and work out the relationships between them.

2. Business Information Technology and Ethics

Finding a precise definition of what information technology is would probably be a difficult endeavour. If one includes communication technologies, then the scope of possible theories and artefacts contained in information technology becomes hard to delimit. One possible definition can be found in Mason et al. (1995, 80), who state that information technology is "the tangible means by which information is manipulated and carried to its ultimate users." (Footnote 1: The notion of IT that I use here exceeds that of information systems. Most of my argument, however, concentrates on computers and information systems, which is why I hope the paper is suitable for HICSS.) Mason et al. continue by pointing out some further ingredients that modern information systems contain. Those are "hardware, software, people, data, and procedures - designed to deliver services intended to improve a social system" (ibid). Business information technology (BIT) then is that part of information technology that is used in businesses. A definition borrowing heavily from Langford (2000, 227) suggests that business information technology is "any activity using information technology which is undertaken by business". And while it may be difficult to give an exact definition of


business information technology from a technical point of view, I believe that our common-sense understanding of the term will mostly suffice for the purposes of this paper. This same common-sense knowledge of BIT allows us a first glimpse of why it might be ethically relevant from the viewpoint of business ethics as well as from the viewpoint of computer ethics. Whether BIT is in fact the backbone of modern business may in some instances be doubted, but it is certainly true that "everyone working in the businesses of today's developed world experiences - and is affected by - computers in virtually every aspect of their working lives" (Langford 1999, 3). If we want to ascertain this first suspicion that BIT is ethically relevant, we next have to shed some light on the subject of ethics. In this paper I want to follow the "German tradition" of applied ethics that makes a distinction between ethics and morality along the lines of theory and practice of moral philosophy. Morality in this model stands for the stock of existing and factually recognized norms, whereas ethics is the philosophical discipline that reflects on the actual multiplicity of moralities (Steinmann / Löhr 1994, 8). Morality is a notion of little controversy since it does nothing more than comprise the norms and rules that govern our social interaction. Ethics can be described in the widest sense as the reflection of morality (Bechmann 1993, 214). Others are more affirmative and believe that ethics has to provide us with answers, or at least arguments, about which morality is the "right" one (Hastedt 1994, 57; Siep 1993, 288). This distinction between ethics and morality, between theory and practice, between doing and reflecting, is a constant feature of German moral philosophy and can be traced back to Kant, whose categorical imperative (Kant 1995, BA 67) is a purely ethical, that is to say reflective, position. Similar ideas can be found in most of the German literature about ethics, especially in modern thought about applied ethics such as medical ethics, business ethics, and computer ethics. The reason why I follow this tradition, apart from my personal provenance, is that it allows a more precise use of the notions, which would be impossible if one equated ethics and morality, as is often the case in Anglo-American literature (Hausman / McPherson 1996, 222; Weckert / Adeney 1997, 1). While it is rather obvious that ethics and morality have something to do with the realm of business, this is far less clear in the case of information technology. Even the biggest sceptic of business ethics must admit that the factual morality of economic agents has relevance for economic decisions and for markets. Whether it is in fact true that the "Social Responsibility of Business is to Increase Its Profits" (Friedman 1970) or whether we believe that businesses have some kind of moral role to play, moral preferences of consumers or producers are relevant because they affect the bottom line. My hypothesis is now that IT also has a high ethical relevance, but in many cases this is due to reasons that are not

directly obvious, and that in spite of this, or maybe because of this, ethics has to pay closer attention to IT as a fact of business life. Before we come to the heart of this argument, however, we should take a look at some of the more commonly recognized ethical problems of BIT. Information technology in general, and its application in business in particular, can lead to a number of ethical problems whose discussion has led to the development of an academic discipline often called computer ethics or information ethics. One purpose of ethics, as we have seen, is the description of morality. Morality is often understood as having the objective of facilitating human life and coexistence. It can thus be understood to be a tool for solving interpersonal conflicts and for optimising social life. Our ways of perceiving, managing, and solving conflicts necessarily alter with the advent of IT, and so does our ability to be good citizens or to contribute to a good society, whatever that may mean. Briefly, "this puts information use squarely in the field of ethics" (Mason et al. 1995, 4). Without being able to discuss these points here in more detail, it should be obvious that information is inherently ethically charged. Without information there would be no life, at least no human life, and there would not even be the possibility of a need for ethics. One branch of information ethics is computer ethics. This branch is more restricted in its subject because it leaves aside a large portion of questions related to information and instead concentrates on ethical problems related to computers. The term was coined in the mid-1970s by Walter Maner, who "began to use the term 'Computer Ethics' to refer to that field of applied professional ethics dealing with ethical problems aggravated, transformed or created by computer technology" (Bynum 2000, 37). In this sense computer ethics is a notion denoting a special field of attention for ethics, similar to business ethics, medical ethics, technical ethics, etc. While authors differ as to whether computer ethics will die out in the near future (Johnson 2000, 25) or whether it will become the focal point and lone survivor of ethics (Bynum 2000, 51f), for the moment it is a small but recognized field of ethics. One of the main differences between information ethics and computer ethics is that computers, contrary to information, are generally not seen as inherently ethically charged. Computers only become ethically relevant because of their ubiquity in modern society and because we cannot avoid their use in our everyday moral dilemmas. Often this sentiment is expressed in phrases like this: "Ethics is about human conduct, and the use and development of computer technology is part of human conduct." (Weckert / Adeney 1997, ix) This, to me, seems to be one of the central mistakes that, due to its status as an axiom of computer ethics, leads to wrong or incomplete conclusions. But before we discuss this, let us take a brief look at the important problems of computer ethics as they are discussed today.


The ethical problems of IT are usually divided into several categories. Hauptman (1999) for example sees the problems of privacy, fraud, monitoring and surveillance, censorship, integrity, and overload. Apart from these general categories one can try to discern problematic areas for certain parts of IT, the most important of which today is certainly the Internet. Here again one can find categories or orders of problems such as those suggested by Langford (1999, 108) who sees data piracy, publication of inaccurate or deliberately erroneous material, inappropriate linking, trademarks and domain name ‘passing off’, and wider issues as moral problems of the Internet. The connection between these problems and business ethics has also been made and De George concludes that "There are […] some areas in which the introduction of the computer into business either raises familiar problems in a somewhat different way or raises new problems […]: (1) computer crime; (2) responsibility for computer failure; (3) protection of computer property, records, and software; and (4) privacy of the company, workers, and customers. Three basic concepts emerge in many of the cases in these areas and will require special attention: information, privacy, and property." (De George 1999, 329)

All of these problems and others have been discussed intensively and are still under investigation by many researchers. We will therefore leave them aside as long as they do not directly support my underlying thesis that one of the central reasons for the ethical problems of BIT is located in its anthropological assumptions. (Footnote 2: For the sake of completeness I have to mention that I do not claim that IT is only negative for man and for ethics. There are doubtless many positive aspects of IT that have led to its widespread use and that also lead to the improvement of human life. Since my argument aims at ethical problems, however, I will leave it at this remark and continue with the ethical problems. Some of the advantages of IT for ethics will be named in the last section of this paper.)

3. IT and Anthropology

Anthropology is the "science" of the human being, and it constitutes one of the fundamental pillars of philosophy. From the beginning of philosophy in ancient Greece up to today, anthropology has always had a close connection to ethics. It is plain to see that the nature of man determines the way men interact and the rules of this interaction; that is what we call morality in this paper. Furthermore, the nature of man also determines the way we think about ethics and our general ability to justify or criticise morality, which is what ethics stands for. It is obvious that a concept of human beings as primarily social beings (Aristotle's zoon politicon) leads to a different concept of

community than does the view of humans as primarily self-centred and anti-social. In order to find out how the anthropological implications of IT affect responsibility, we first have to give some thought to these implications and their roots. Information technology must have anthropological foundations. If computers and technology are made for the use of humans, then there need to be assumptions about these humans and their capacities. Some of these assumptions are so obvious that to mention them may seem trivial. The use of a keyboard, for example, implies the existence of hands and fingers, and for a monitor to make sense the user must be equipped to see and understand the symbols displayed on it. There are other assumptions or implications that are less obvious and that are, maybe for that reason, important for the impact of IT on ethics. One description of human beings is that they are incomplete when compared to other species (Gehlen 1997). Man's natural equipment seems to lack the necessities that almost any other species has. We have neither fur nor weapons; we can neither run quickly nor swim well nor fly. In order to survive, human beings had to rely on their intellectual capacities. Since survival would have been impossible in most environments without the help of thought, and consequently of tools, man has often been called the tool-making animal. His being unspecialised has often been perceived as man's weakness, but it can also be seen as his strength. It can be argued that only our lack of specialisation enables us to adapt to many environments and that it made us the winners of evolution that we are (Eibl-Eibesfeld 1997, 244). Either way, man has the capacity to develop and use tools to change his environment according to his needs and wishes. A commonly found interpretation of IT is now that it is nothing but a certain sort of tool, albeit a sophisticated one. Technology as the continuation of tools extends human reach and substitutes for human action. It helps us achieve our goals, and it can even be seen as the incarnation of our Western culture (Collste 2000, 181). Technology gives man new power, and this power is what is most commonly seen as the ethically relevant point of BIT. Before we come to this, we should try to dig a little deeper into the relationship of technology and anthropology and try to understand how the concepts of man and machine sometimes become confused or even fused.

3.1 Man and Machine

The interpretation of the relationship of man and machine offers two extreme positions, namely that of man as machine and that of machine as man. The first one, man as machine, is an interpretation that can be found since the beginnings of philosophy (Leucippus,


Democritus) and is connected to such well-known names as Descartes and Hobbes. In accordance with the technological ideas of his time, Hobbes sees man as a mere mechanical apparatus (Hobbes 1994). One can state that the interpretation of man as a machine has changed with the machines used by mankind. The modern equivalent of Hobbes's man as a mechanical puppet is therefore the model of man as an information processor. There are several aspects of man and computer that suggest this comparison. Human beings, in order to survive, have to process information. If we were not able to receive information, filter the relevant pieces, and act accordingly, we would not be able to survive. To some extent this is true for every living being, but it may be especially true for us because our capacities for processing information are higher than those of most or all other animals. Another point may be the abstract and symbolic nature of the information we can process. Unlike animals, we can transform most of the information we receive into symbols like language or writing and then work with it on an abstract level. In the end we can translate the results back, and they have a considerable impact on our environment or on us. On the other hand there are relevant differences between computers and humans, some of which will be discussed later. It does seem clear that our society in general and business in particular are moving towards the often invoked information society. Without bothering too much about what this notion stands for, one can conclude that this change will have consequences for the members of this society. Some authors conclude that the information society even requires a new conceptualisation of mankind. This new archetype is then called the "information person": "Information persons are primarily givers, takers, and orchestrators of information. They live and work in a period in which most people have close contact with information technology and in which all of us are affected by it." (Mason et al. 1995, 24) This idea is close to the idea of man as an information-processing machine; it is the application of that concept to the development of society. The other possible conception of the relationship between man and machine, besides the modelling of man according to the capacities of a machine, is the attempt to reverse this association and to model machines as humans. While this was easy to reject when our technology was still mechanical, it is now easier to conceive. Computers are becoming increasingly interactive. This new way of interacting with users, combined with new approaches to computers such as expert systems or artificial intelligence, may lead us to believe that the computer actually is more than a machine. This raises the question whether machines can think. Alan Turing put forward the famous test according to which the capacity of thinking could be measured in terms of the

capacity of communicating (Bollmann / Heibach 1998, 324). The relationship of man and machine, especially IT, of course has more aspects than the ones mentioned so far. It would for example be possible to look at the age-old question of body and soul from the perspective of IT. If we say computers are indistinguishable from humans then we might ask whether they have souls, which again leads to a whole host of mostly theological problems (Weckert / Adeney 1997, 146). For our purposes it will suffice to concentrate on the more manifest aspects described so far.

3.2 BIT, Ethics, and Anthropology

Ethics and morality are both based on assumptions about the nature of humankind. Many ethical theories are based on the "fundamental recognition that human life is vulnerable and fragile, defenseless and abandoned, mangled and suffering. Ethics emerges from our confrontation with this inescapable fact of human existence" (Mendieta 1999, 116). (Footnote 3: This quote refers specifically to Dussel's ethics, but it is applicable to a number of other ethical concepts.) According to Habermas (1991, 14), morality can be understood as a safeguard that has the purpose of compensating for the vulnerability that is a structural part of our socio-cultural form of life. One philosophical concept with ethical relevance based on anthropological assumptions is that of the person. It is worth noting in this paper because the person is often seen as the subject of responsibility and ethics. The notion of a person descends etymologically from the Latin persona, which originally denoted the roles of a theatre play (French 1992, 134). From there it evolved to describe the sides of a legal dispute, and then it became a moral notion. According to Kant (1990, 53 f.), a person is the subject to whom actions can be attributed. It follows from this that the person has to be an autonomous being that follows laws it has given itself. Today there are three different parts that constitute personhood in our Western tradition: metaphysical, moral, and legal concepts (French 1991, 133f). Most important in this context is the moral side, the fact that the person is the centre of moral ascriptions (Flynn 1984, 10). Before we return to the question of the person under the heading of responsibility, we should take a brief look at some other aspects of the connection of ethics, BIT, and anthropology. As already mentioned, the aspect most commonly named as a reason for ethical considerations of IT is its use as a tool. Like all tools, IT increases the range and scope of humans' power over their environment and over fellow humans. "Power is the ability to achieve one's goals, and in an information society information replaces weaponry and monetary wealth as the principal source of power. The exercise of power always raises ethical issues." (Mason et al. 1995, xvi) All uses of power are


ethically charged because they affect others in their (moral) rights and obligations. Since modern technology allows us to affect more people than ever before with decreasing effort, it has ethical implications. The use of IT as a tool brings disruptions to many traditional social settings, thus threatening "existing distributions of power, money, rights, and obligations" (Laudon / Laudon 1999, 453). What is worse, not only does the actual use of IT lead to an increase of power; the mere consideration of its use and the subsequent rationalisation of processes and structures have the same effect (Johnson 2000b, 90). All of this I want to call the "tool approach". It is held together by the thought that only the use of an otherwise ethically neutral device leads to its ethical relevance. Representative of the tool approach is Johnson when she says: "[...] it is not exactly computer technology that poses or could pose an ethical issue but rather what the technology makes possible for human action and interaction." (Johnson 2000, 26)

There are, however, views that see IT as inherently ethically relevant, independent of the use it is put to. An important reason for this kind of argument is the anthropological idea of man as a machine. If one subscribes to this identification, then it is no longer important to observe moral rules, since those apply only to humans, not to machines. "Once human and machine are said to be indistinguishable, our previous moral commitments are easily cast aside in the name of progress. Conduct previously prohibited becomes permissible simply because technology makes it possible." (Kerr 1999, 241) Some other important ethical considerations are also based on the idea of the equality of human beings and information-processing machines. These considerations can be summarized under the heading of "dehumanization". This stands for a variety of ways in which computers, information systems, or modern technology in general take away some of the central and defining characteristics of mankind. According to Johnson (2000, 19), this was one of the first ethical problems discussed in relation to IT. Especially the fear that computers might take over our most important activities, such as decision-making, is the background to this sentiment. It is reflected in popular culture in literature and films about computers becoming independent of humans and taking over the rule of the world. Why would we see this as dehumanizing? Apparently we take pride in our supposedly unique capacity for making judgments. If computers can do the same thing, this lessens our self-respect. Information systems that try to duplicate these human abilities, such as expert systems or artificial intelligence, thus have the potential to be dehumanizing. The problem does not stop there. Computers taking over human tasks not only affect our self-perception,

but they may even physically replace humans, thus making them redundant. While this may be positive in the case of hard or dangerous labor, it may have devastating effects on whole industries and their employees. The blurring of the distinction between man and machine can also lead to physical or psychological illness. A lack of ergonomics can induce repetitive stress injuries, but more importantly, there is a new work-related illness called technostress. It is stress induced by computer use, with symptoms including aggravation, hostility towards humans, impatience, and fatigue. It is caused by humans' continuously working with computers and thus coming "to expect other humans and human institutions to behave like computers, providing instant responses, attentiveness, and an absence of emotion." (Laudon / Laudon 1999, 476) Privacy, one of the frequently discussed topics of computer and information ethics, also has an anthropological side to it. Human communication, which is at least an important part of what it means to be human and may even be the basis of reality, is severely handicapped by invasions of privacy. It is one of the assumptions we make when we communicate that we have complete power over who is privy to this communication. IT, and especially its uses and abuses in business, calls into question whether that is still so in the information age. There are many possible ethical arguments against the invasion of privacy, such as the argument that it is a sign of the use of man as a means instead of an end, but there is also an anthropological side to it. "Privacy, or personal freedom, is the basis for self-determination, which is the basis for self-identity as we understand it in American society." (Severson 1997, 65) After having established the link between BIT, ethics, and anthropology, I will in the final step of my argument focus on the notion of responsibility and demonstrate that the use of IT leads to increased problems in this area.

4. IT and Responsibility

Why does it make sense to further complicate the discussion by introducing another notion, and why do I not simply stop at the end of the last section? There are several reasons for taking the notion of responsibility into account. First of all, responsibility makes it easier to communicate questions of ethics and morality in a business setting. While many actors in business, from top-level management to the worker on the assembly line, will quickly conclude that business ethics is a contradiction in terms, the same is not true for responsibility in business. Every manager or worker will agree that she is responsible for her work and that there are a lot of responsibilities in economic life. The same is true for technical communities, which tend to avoid ethics but will recognize the importance of responsibility. This


positive aspect of the notion of responsibility leads to a negative one: its ambiguity. The ubiquity of the notion in today's discourse has left it without a binding meaning. When talking about responsibility it is therefore necessary to say a few words about the notion before drawing any conclusions. Another reason for the discussion of responsibility is that some of my arguments become clearer when rephrased in terms of responsibility. Finally, there are some structural similarities between business, information systems, and responsibility that facilitate the translation from one area to the other.

4.1 Responsibility

Responsibility is a term stemming from the legal sphere, where it stands for the response that the accused has to give to the judge. This etymological root, the answer, can be found in the English respons[(e)ibility], the French respons[(e)abilité], or the German (Ver)Antwort(ung). The stem of the word points in the direction of communication, which will be the central thrust of the following argument. Another important aspect is that responsibility is not a fact of nature but a social construct with the aim of ascribing something to someone. Due to a lack of space I will omit a more detailed discussion of the notion of responsibility and just try to show three features that most ascriptions of responsibility share. Those common features are openness, closeness to action, and consequentialism. Openness stands for the property of responsibility of not giving clear instructions on how to behave, which is what many traditional moralities do. The openness of responsibility is due to the fact that it is a communicative notion and that it aims at ascribing an object to a subject. While there may be meta-rules for that, there are no direct rules for our behaviour involved (Kaufmann 1995, 88). Closeness to action means that the use of the word responsibility always promises some manifest results. If we say that somebody is responsible for something, be it good or bad, we imply that this responsibility will have an effect. Consequentialism, lastly, is the consideration of results as the basis for an ethical judgement. It can also be interpreted as a kind of teleology, which in this case means that some idea of the "good life" is the motivation for moral action (Ropohl 1987, 157). All three of these points are closely related to business life, and combined they offer one possible explanation for the popularity of the notion of responsibility.

4.2 IT as Support of Responsibility

Before we proceed to the additional problems for responsibility induced by the use of new technologies in businesses, we should, for the sake of a balanced argument,

look at the additional advantages. Ethically motivated responsibility is a social construct which aims at ascribing results to agents. Its most basic requirements are the ability and the willingness of all affected parties to communicate. This is where the aims of IT coincide with the aims of responsibility. In this sense the two concepts are structurally similar. They not only help people to communicate, but they also aim at the creation of a shared reality. One of the important tasks of information systems is to inform people of what is happening and what is real. Management information systems are an example: they tell management what the status quo is, which in turn serves as the basis for further decisions. Responsibility, on the other hand, requires exactly this kind of accord. In order to ascribe responsibility, the affected parties not only have to share moral norms but, maybe more importantly, they have to share the same reality. IT can be helpful in this respect. Also, IT and its use in business allow for formerly unimaginable increases in the range and scope of communication. It is no problem for a user of the Internet to acquire information about all kinds of facts or activities that used to be inaccessible. Therefore IT can help responsibility develop in areas where it might not have been a useful concept before. On the other hand, it increases the burden of information on the individual. We can no longer pretend we do not know what happens in the Third World or other sites of global problems. As Górniak-Kocikowska puts it: "for the first time in the history of the earth, ethics and values will be debated and transformed in a context that is not limited to a particular geographic region, or constrained by a specific religion or culture." (Górniak-Kocikowska, quoted in Bynum 2000, 49) The positive effects of the spread of information are not confined to responsibility but can have political implications as well. This argument can be put as follows: "if democracy means power in the hands of the many, if information is power, and if the GII [Global Information Infrastructure, BCS] puts information in the hands of the many, then it follows that the GII means (supports, leads to) democracy." (Johnson 2000b, 98) Furthermore, one can conclude that the increasing use of BIT even leads to the necessity of responsibility and thus promotes its development: "[…] developing new expectations of appropriate electronic behavior is essential, but such development can only be effective through emphasis on personal responsibility and appropriate education." (Langford 1999b, 73) Whether this is really so would have to be debated, and this kind of conclusion looks very much like a naturalistic fallacy. However, there do seem to be numerous ways in which IT might further the assumption or discharge of responsibility. This discussion will have to wait for another occasion, since I now want to concentrate on the intrinsic problems that BIT holds for responsibility.


4.3 BIT as a Problem for Responsibility

In the next part of this last section I want to take a brief look at some of the problems that the use of IT brings to responsibility (a), before I shift my attention to those problems of responsibility that are specifically caused by the anthropological assumptions and problems of IT (b).

(a) One open question in relation to the use of communication technology and responsibility is the question of accountability. Accountability can be defined as the social side of responsibility, as "a feature of systems and social institution: It means that mechanisms are in place to determine who took responsible action, who is responsible." (Laudon / Laudon 1999, 457) If we accept this definition, then it is easy to see how the increasing use of IT may lead to problems in this area. Interconnection often makes it difficult to trace the origin of one particular piece of information. Collste (2000c) uses the example of medical consultations via the Internet to demonstrate this point. More generally it can be said that the tracing of information, and thus the ascription of accountability and responsibility, may become more difficult due to IT. Another problem is the legal aspect of responsibility. So far we have concentrated on moral responsibility, but in many cases it is closely related to the legal sort. Legal responsibility faces a whole number of new problems caused by the increased use of IT. Laws are by their nature slow to change, and in the rapidly changing world of information and communication technology lawmakers are often hard pressed to keep up with the relevant developments. But even if one particular legislative body manages to do so, that may turn out to be useless because of the international nature of today's technology. The Internet is again the best example of this. If a French person reads something on the Internet that is illegal in Germany, that was written by an Italian, and that is stored on an American server but routed through German systems, then this produces numerous legal problems, which I do not even pretend to be aware of. Apart from many further detailed problems that might be discussed here, such as the question whether IT really helps democracy or does not in fact hinder it (Breen 1999), the last general point I want to make now is the problem of access. Even without too much ethical theory it seems plausible that a responsibility ascription can only retain its moral status if all affected parties are involved. IT can be a big plus in this respect, since it allows more people to interact on certain topics than ever before. The downside, however, is that any attempt to discharge responsibility is doomed to failure if it relies on IT and not all of the affected parties have access to IT. And this leads to the general problem of access to new technologies. We already divide the world into information rich and

information poor, which generally corresponds to financially rich and poor. The attempt to realize responsibility via IT will lead to the exclusion of the information poor, who are the financially poor, who are mostly excluded already. A moral notion like responsibility may thus lead to immoral effects.

(b) Let us now come to the last point of my argument, to the problems that the use of information technology causes for responsibility due to its anthropological assumptions. The first and, to my mind, most important point is that of a structural weakness of communication via electronic means. This structural weakness can best be explained on the basis of a theory of communication. The theory of communication that this paper is based on is that of Jürgen Habermas. Since it is impossible to adequately reflect this extensive theoretical structure here, I can only point to the relevant literature (Habermas 1981) and try to spell out the parts I regard as most relevant for this topic. For Habermas, communication is aimed at the coordination of the behaviour of different people. He states that every piece of communication implies three validity claims (Geltungsansprüche), namely truth, rightness, and veracity. That means that every sentence we say implies that the speaker holds the stated facts to be true, holds the stated norms to be valid, and means what she says. This, of course, does not mean that we always speak the truth, accept valid norms, or mean what we say. It only means that we have to rely on these assumptions if communication is to be fruitful. All three of these validity claims, in order not to remain in obscurity or idiosyncrasy, can be challenged by every participant in the discourse. The discourse is another normative fiction, which comprises everybody who has something to say on a certain topic and which is without temporal or spatial limits. All of these ideals, even though unattainable in real life, play an important role in our factual communication. They also lead to consequences in different areas of philosophy such as ethics (Habermas 1991), philosophy of law (Habermas 1998), or sociology. Habermas' ideas lend themselves perfectly to an interpretation of responsibility because of their emphasis on communication. We have seen that communication is the theoretical and practical root of responsibility, and therefore the ethical but also the sociological results of Habermas' theory of communicative action go well with it. This is certainly the reason why, in the German field of applied moral philosophy and especially in business ethics, Habermas has a large number of followers who more or less directly try to derive their ideas from his theories (e.g. Ulrich 1997). The central idea of doubting and justifying validity claims as a means to coordinate human action leads to many difficulties when transferred from theory to practice. The questions might be: What are the relevant norms, who is affected, what is the relevant piece of


reality and how do we determine it, when do we end the discourse, and so on? While this is already true for any sort of responsibility ascription, it becomes much worse if we attempt to do the same via IT. Our new media make it harder even to discern which validity claim is contested. While in person-to-person communication it may be possible to determine the truth, that is to say those propositions that are accepted by all participants in the discourse, the same can no longer be said for technologically enhanced communication. The advantage of linking otherwise unacquainted people turns into a disadvantage if the respective life-worlds of these people are so different that even the simplest validity claims become contested. For successful communication, though, one needs consensus, at least consensus about the points of dissent. This seems much harder to come by in the case of communication via IT. Apart from the differing life-worlds there are other difficulties that the medium of technology poses for communication. One of these problems may be described under the heading of the "Other". The Other in this case is the other person with whom the moral agent wants to or has to interact. This idea of the other as the central point of ethics had a strong influence on the French ethics of the 20th century, from Sartre to Levinas. Even though there are great differences between these theories of ethics, their common point is that the other imposes ethical responsibilities on the moral agent. For Sartre it is the fact that the agent recognizes an entity similar to herself in the Other. For Levinas, on the other hand, the Other signifies otherness, that which is completely alien to the agent. His countenance then forces responsibility on the agent. Whether one subscribes to these views or not, it is easy to see that the Other is important for ethics. Morality, interpreted sociologically as a means to the end of group survival, is grounded in human interaction in small groups. This kind of interaction again finds its most important realisation in the interaction with other individuals, which can therefore be seen as the basis of ethics. Technological interaction leads us to lose sight of the other and thus touches the anthropological roots of morality. "[…] the electronic environment can make us lose sight of the fact that there is a person at the other end of the machine." (Gannon-Leary 1999, 172) The other loses his face, even his reality, and this of course reflects back on morality and responsibility. It is difficult to envisage the duty to answer to somebody whom one cannot even imagine. This situation may be aggravated by technological measures to stop certain kinds of communication, such as censorship. While censorship may be justified, it holds a hidden danger caused by the anthropological blindness of technological communication. The danger is that we may lose possibilities of communication that we are not even aware of. Many of the programs used for blocking out certain parts of the Internet, for example, do so on the basis of algorithms that are generally unknown to the user. The fact that such programs tend to block out names such as Sussex shows that they are error prone. While this is no problem in principle, it can become serious when it is not known which part of the information is suppressed.
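To make this concrete, the following is a minimal, purely illustrative sketch of naive keyword-based blocking. It does not reproduce the algorithm of any actual filtering product; the block list and sample texts are invented for the example. A filter that matches raw substrings will suppress a harmless page about the University of Sussex because the place name happens to contain a blocked string, and the reader is never told which rule was responsible.

```python
# Illustrative sketch only: naive substring-based content blocking
# (hypothetical block list; not the algorithm of any real filtering product).

BLOCKED_SUBSTRINGS = ["sex", "bomb"]  # invented list for the example

def is_blocked(text: str) -> bool:
    """Return True if any blocked substring occurs anywhere in the text."""
    lowered = text.lower()
    return any(term in lowered for term in BLOCKED_SUBSTRINGS)

if __name__ == "__main__":
    samples = [
        "Course catalogue of the University of Sussex",  # false positive
        "Timetable of the Department of History",
    ]
    for sample in samples:
        verdict = "BLOCKED" if is_blocked(sample) else "allowed"
        print(f"{verdict}: {sample}")
    # The first, harmless sentence is blocked because "Sussex" contains "sex",
    # and the reader is typically not told which rule suppressed it.
```

The point of the sketch is only that the blocking criterion remains invisible to the person whose communication is filtered, which is precisely the hidden danger described above.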

Another observation is that technological means are not able to reproduce all relevant aspects of communication. In order for communication to be successful as a means of coordinating people, all of its aspects must be understood. Written communication like that on the Internet or in email cannot reflect the nonverbal part, which, according to some authors, accounts for up to 65 percent of communication (Mills, quoted in Weckert / Adeney 1997, 114). Yet another difficulty for the ascription of responsibility caused by anthropological questions of IT is the relationship of responsibility and the person. We have seen that the notion of the person is often used to denote the human being as a moral agent. It is a philosophical construction that tries to capture some of the facets of human life which go beyond simple and simplistic definitions of man. (Footnote 4: A simplistic definition of humans would, for example, be the ridiculous reduction of man to his genes that seems to follow the publication of the genome. The notion of a person aims to comprise more aspects of man than just the scientifically observable.) The property of being a person is often seen as a precondition for participation in the process of responsibility ascription. Some authors go so far as to say that responsibility and person are two concepts that are dependent on one another and that are necessary for a comprehensive description of humanity (Neuberg 1997, 2). "We can assign responsibility only to persons." (Baier 1972, 50) One of the many aspects of this relationship between responsibility and person is the question of body and mind. For Velasquez (1991), moral responsibility has to originate in the agent, and this origination (dependent on mens rea and actus reus) can only come from human beings as beings with a mental and bodily unity. Computers and IT in their present form cannot be persons; therefore they cannot be responsible in the moral sense of the word. (Footnote 5: Should computers ever progress to the forms we find in science fiction, the question might have to be reconsidered; the android Data in Star Trek, for example, would be a case in point for such a discussion.) Their lack of personhood makes it more difficult to convey the information necessary for responsibility. The complete lack of the bodily side of being human and being responsible leads to problems worth exploring further. Another anthropological problem is the perception of man as a machine, which leads to problems for responsibility beyond the lack of personhood of machines. We have seen that human vulnerability, physical and more importantly social vulnerability, can be seen as the reason for morality and ethics. If we see man as a machine, this sense of vulnerability is lost, and with it we lose the reason for acting morally and thinking ethically. Also, a machine that does not work any more can be repaired or has to be


replaced. Treating human beings this way is highly immoral. There are many who believe that today's societies are drifting in this direction, not least because of the impact of IT. Man as a machine also does not need rest, does not become ill, does not have social and personal wishes, etc. On the other hand, the often-bemoaned information overload and new illnesses like technostress seem to be the result of this view of humans. We tend to treat people like information systems, which have a high input capacity and can be upgraded if need be. By doing so we often fail to recognise people's needs and capabilities. To make the last point in my argument that anthropological assumptions based on IT lead to new problems for responsibility, I have to borrow from another field of philosophy, from metaphysics. Metaphysics as the "science" of being has close ties to anthropology as well as to ethics. On the one hand, computers reduce reality to highly structured and formalized models that can be stored and processed using mathematical algorithms. This leads to the problem, already discussed, that many important aspects of communication are left out. "Computers do not know how to handle data that are related to extraordinary events or individual persons, or issues that cannot be quantified or coded." (van Bemmel 2000, 147) On the other hand, computers create new realities (for example via virtual realities, but also on a more mundane level), and they control the access to these realities (Britz 1999, 13). This is in itself a problem for which responsibility must be assumed, but it can also become a problem for the assumption of responsibility. The multiplicity of realities increases the difficulty for moral agents of finding out which reality is relevant. Without a consensus about this, however, ascriptions of responsibility are doomed to failure.

5. Conclusion

Computers, information systems, and the use of information technology in and outside of businesses lead to new opportunities and new risks for the assumption of responsibility. This summary of the article may not be too surprising. The analysis of the problem did show, however, that while the advantages on the whole seem to be rather superficial, the problems run deeper. The interaction of man and machine leads to problems that cannot be easily overcome. There are suggestions as to how to deal with some of the problems of the use of IT in businesses. These often tend in the direction of some formal approach. On the one hand we have codes of conduct, of which there is an ever-growing number for the use of IT. On the other hand there is the approach via legal procedures. If my argument in this paper is right, such formal approaches can at best be partial solutions.

Responsibility has been shown to be an integral notion for conveying moral and ethical questions in businesses. As such, any ethicist can probably approve of the attempt to increase the scope of responsibility in business. On the other hand, this article shows clearly that there are limits to the increase and the acceptability of responsibility. If these limits are recognized and taken into account, then there is a good chance that the use of IT can further not only business processes but also moral concerns. Otherwise there are possibly severe pitfalls that may turn ethical intentions into unethical results.

References

Baier, Kurt (1972): Guilt and Responsibility. In: French, Peter (ed.) (1972): Individual and Collective Responsibility: Massacre at My Lai. Cambridge, Massachusetts: Schenkman Publishing Company, 35-62
Bayertz, Kurt (ed.) (1993): Evolution und Ethik. Stuttgart: Reclam
Bechmann, Gotthard (1993): Ethische Grenzen der Technik oder technische Grenzen der Ethik? In: Geschichte und Gegenwart. Vierteljahreshefte für Zeitgeschichte, Gesellschaftsanalyse und politische Bildung 4, 12. Jahrgang, Dezember 1993: 213-225
Bechtel, William (1985): Attributing Responsibility to Computer Systems. In: Metaphilosophy Vol. 16, No. 4, October 1985: 296-305
Birnbacher, Dieter (1995): Grenzen der Verantwortung. In: Bayertz, Kurt (ed.) (1995): Verantwortung: Prinzip oder Problem? Darmstadt: Wissenschaftliche Buchgesellschaft, 143-183
Bollmann / Heibach (eds.) (1998): Kursbuch Internet - Anschlüsse an Wirtschaft und Politik, Wissenschaft und Kultur. 2nd edition, Reinbek bei Hamburg: Rowohlt Taschenbuch Verlag
Breen, Marcus (1999): Counterrevolution in the Infrastructure - A Cultural Study of Techno-Scientific Impoverishment. In: Pourciau, Lester J. (ed.) (1999): 29-45
Britz, J. J. (1999): Ethical Guidelines for Meeting the Challenges of the Information Age. In: Pourciau, Lester J. (ed.) (1999): 9-28
Bynum, Terrel Ward (2000): Ethics and the Information Revolution. In: Collste, Göran (ed.) (2000): 32-55
Collste, Göran (ed.) (2000): Ethics in the Age of Information Technology. Linköping: Centre for Applied Ethics
Collste, Göran (2000b): Ethical Aspects of Decision Support Systems for Diabetes Care. In: Collste, Göran (ed.) (2000): 181-194
Collste, Göran (2000c): The Internet-Doctor. In: Collste, Göran (ed.) (2000): 119-129
De George, Richard T. (1999): Business Ethics. 5th edition, Upper Saddle River, New Jersey: Prentice Hall
Eibl-Eibesfeld, Irenäus (1997): Der Mensch - das riskierte Wesen. Zur Naturgeschichte menschlicher Unvernunft. 3rd edition, München / Zürich: Piper


Flynn, Thomas R. (1984): Sartre and Marxist Existentialism: The Test Case of Collective Responsibility. Chicago / London: The University of Chicago Press
French, Peter A. (1992): Responsibility Matters. Lawrence, Kansas: University Press of Kansas
French, Peter A. (1991): The Corporation as a Moral Person. In: May, Larry / Hoffman, Stacey (eds.) (1991): Collective Responsibility: Five Decades of Debate in Theoretical and Applied Ethics. Savage, Maryland: Rowman & Littlefield Publishers Inc., 133-149
Friedman, Milton (1970): The Social Responsibility of Business is to Increase Its Profits. In: The New York Times Magazine, September 13, 1970
Gannon-Leary, Pat (1999): The Ethics of Email. In: Pourciau, Lester J. (ed.) (1999): 165-190
Gehlen, Arnold (1997): Der Mensch. Seine Natur und seine Stellung in der Welt. Stuttgart: Uni Taschenbuch
Habermas, Jürgen (1998): Faktizität und Geltung: Beiträge zur Diskurstheorie des Rechts und des demokratischen Rechtsstaats. 1. Auflage, Frankfurt a. M.: Suhrkamp
Habermas, Jürgen (1991): Erläuterungen zur Diskursethik. Frankfurt a. M.: Suhrkamp
Habermas, Jürgen (1981): Theorie des kommunikativen Handelns. (2 volumes) Frankfurt a. M.: Suhrkamp
Hastedt, Heiner (1994): Aufklärung und Technik. Grundprobleme einer Ethik der Technik. Frankfurt a. M.: Suhrkamp Taschenbuch Wissenschaft 1141
Hauptman, Robert (1999): Ethics, Information Technology and Crisis. In: Pourciau, Lester J. (ed.) (1999): 1-5
Hausman, Daniel M. / McPherson, Michael S. (1996): Economic Analysis and Moral Philosophy. Cambridge et al.: Cambridge University Press
Hobbes, Thomas (1994): Leviathan. London: Everyman
Johnson, Deborah G. (2000): The Future of Computer Ethics. In: Collste, Göran (ed.) (2000): 17-31
Johnson, Deborah G. (2000b): Is Democracy Embedded in the Internet? In: Collste, Göran (ed.) (2000): 89-103
Kant, Immanuel (1995): Kritik der praktischen Vernunft, Grundlegung zur Metaphysik der Sitten. Frankfurt a. M.: Suhrkamp Taschenbuch Wissenschaft
Kaufmann, Franz-Xaver (1995): Risiko, Verantwortung und gesellschaftliche Komplexität. In: Bayertz, Kurt (ed.) (1995): Verantwortung: Prinzip oder Problem? Darmstadt: Wissenschaftliche Buchgesellschaft, 72-97
Kerr, Ian R. (1999): Mind Your Metaphors. In: Pourciau, Lester J. (ed.) (1999): 231-251
Langford, Duncan (2000): Ethical Issues in Business Computing. In: Collste, Göran (ed.) (2000): 225-235
Langford, Duncan (1999): Business Computer Ethics. Harlow et al.: Addison-Wesley
Langford, Duncan (1999b): Beyond Human Control - Some Implications of Today's Internet. In: Pourciau, Lester J. (ed.) (1999): 65-75
Laudon, Kenneth C. / Laudon, Jane P. (1999): Essentials of Management Information Systems. 4th edition, London et al.: Prentice Hall

Mason, Richard O. / Mason, Florence / Culnan, Mary J. (1995): Ethics of Information Management. Thousand Oaks / London / New Delhi: SAGE
Mendieta, Eduardo (1999): Review essay: Ethics for an age of globalization and exclusion. In: Philosophy & Social Criticism Vol. 25, No. 2: 115-121
Neuberg, Marc (1997): La responsabilité - questions philosophiques. Paris: Presses Universitaires de France
Pourciau, Lester J. (ed.) (1999): Ethics and Electronic Information in the 21st Century. West Lafayette, Indiana: Purdue University Press
Ropohl, Günter (1987): Neue Wege, die Technik zu verantworten. In: Lenk, Hans / Ropohl, Günther (eds.) (1987): Technik und Ethik. 2. Auflage 1993, Stuttgart: Philipp Reclam jun., 149-176
Severson, Richard J. (1997): The Principles of Information Ethics. Armonk, New York / London: M. E. Sharpe
Siep, Ludwig (1993): Was ist Altruismus? In: Bayertz, Kurt (ed.) (1993)
Ulrich, Peter (1997): Integrative Wirtschaftsethik - Grundlagen einer lebensdienlichen Ökonomie. Bern, Stuttgart, Wien: Haupt
van Bemmel, Jan H. (2000): Protection of Medical Data. In: Collste, Göran (ed.) (2000): 145-158
Velasquez, Manuel (1991): Why Corporations Are Not Morally Responsible for Anything They Do. In: May, Larry / Hoffman, Stacey (eds.) (1991): Collective Responsibility: Five Decades of Debate in Theoretical and Applied Ethics. Savage, Maryland: Rowman & Littlefield Publishers Inc., 111-131
Weckert, John / Adeney, Douglas (1997): Computer and Information Ethics. Westport, Connecticut / London: Greenwood Press
