Artificial Consciousness

Vincenzo Tagliasco, Riccardo Manzotti
DIST - University of Genoa, Viale F. Causa 13, 16145 Genova
[email protected], [email protected]

In 1992, the term 'artificial consciousness' was used for the first time in a scientific context (Aleksander, 2000). Since then, no consensus has emerged among scientists about if and when such an achievement will be possible. The experienced properties of the conscious mind, among which unity, representation and being in relation-with, are not shared by any known physical system. As Jaegwon Kim wrote, "We are not capable of designing, through theoretical reasoning, a wholly new kind of structure that we can predict will be conscious; I don't think we even know how to begin; or indeed how to measure our success" (Kim, 1993). This is the necessary conclusion if the physicalist, reductionist, objectivist framework is accepted. Given these premises, it is impossible to build an artificial system that allows a conscious mind to occur; indeed, following such premises, not even our biological brains could produce anything like our conscious experience.

A theory of mind must therefore propose a new framework and a way to use that framework both to explain biological brains and to design artificial ones. Existing approaches fail on one or the other of these two constraints. For instance, Edelman and Aleksander do not propose any new ontological framework (Aleksander, 1994, 2000; Edelman, 1987; Edelman and Tononi, 2000), while Davidson, Chalmers and others do not propose any new design approach (Chalmers, 1996; Davidson, 1980).

We have presented a different framework, one capable of endorsing a structure that, in its ultimate realization, will allow a conscious mind to occur (Manzotti, 2001; Manzotti and Tagliasco, 2001). We call this framework the Theory of the Enlarged Mind (TEM for short). According to TEM, every conscious event corresponds to a counterfactual relation with those events that constitute its content. By 'counterfactual relation' we mean a particular kind of causal relation between occurrences of events. In short, the rationale for this claim is that between events there is a particular kind of relation; it is the foundation for the intentionality of mental events, and it is perceived as a counterfactual relation. An artificial conscious system is a system capable of letting events occur in a counterfactual fashion with external events. A moment of consciousness corresponds to an event that has a counterfactual relation with all those events that are part of the content of that moment of consciousness.

A number of issues crop up. The first is that an artificial consciousness is necessarily an embodied mind: the content of a conscious mind must be those events that are counterfactually related to it. The second is that a conscious system must grow inside the same environment that it will consciously perceive. In other words, the mind is a structure of counterfactually related events; this structure enlarges itself each time the conscious mind perceives something new, so the mind is a product of ontogenesis. Thirdly, there is no a priori reason why this collection of counterfactually related events cannot take place thanks to an artificial structure instead of a biological one.
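The structural reading of TEM given above, a mind as a growing collection of events linked by counterfactual relations, can be made concrete with a minimal sketch. The following Python toy model is purely illustrative and is not the authors' design: it assumes that events can be treated as atomic labelled occurrences, that a 'counterfactual relation' can be reduced to a recorded dependency edge, and all names (Event, EnlargedMind, perceive, content_of) are hypothetical.

# Purely illustrative toy model (not the authors' design): the "enlarged mind"
# is sketched as a growing structure of events linked by counterfactual relations.

from dataclasses import dataclass, field


@dataclass(frozen=True)
class Event:
    """An occurrence, external or internal, identified by a label (hypothetical)."""
    label: str


@dataclass
class EnlargedMind:
    """A structure of counterfactually related events.

    Each act of perception adds an internal event whose occurrence is,
    by assumption, counterfactually dependent on an external event,
    so the structure enlarges over the system's ontogenesis.
    """
    events: set[Event] = field(default_factory=set)
    # Each edge (internal_event, external_event) stands in for the claim
    # "the internal event would not have occurred without the external one".
    counterfactual_relations: set[tuple[Event, Event]] = field(default_factory=set)

    def perceive(self, external: Event) -> Event:
        """Let a new internal event occur in counterfactual relation with `external`."""
        internal = Event(f"experience-of-{external.label}")
        self.events.update({external, internal})
        self.counterfactual_relations.add((internal, external))
        return internal

    def content_of(self, moment: Event) -> set[Event]:
        """The content of a moment of consciousness: the events counterfactually related to it."""
        return {ext for (inner, ext) in self.counterfactual_relations if inner == moment}


# Example of ontogenesis: the structure enlarges with each new percept.
mind = EnlargedMind()
moment = mind.perceive(Event("red-patch"))
print(mind.content_of(moment))  # {Event(label='red-patch')}

Representing the mind as an open-ended set of relation edges, rather than as a fixed architecture, is meant only to make the ontogenesis claim tangible: the structure has no predetermined size and grows with every act of perception.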

References

Aleksander, I. (1994). Towards a Neural Model of Consciousness. Proceedings of ICANN 94.
Aleksander, I. (2000). How to Build a Mind. London: Weidenfeld & Nicolson.
Chalmers, D. J. (1996). The Conscious Mind: In Search of a Fundamental Theory. New York: Oxford University Press.
Davidson, D. (1980). Essays on Actions and Events. Oxford: Clarendon Press.
Edelman, G. M. (1987). Neural Darwinism: The Theory of Neuronal Group Selection. New York: Basic Books.
Edelman, G. M., and Tononi, G. (2000). A Universe of Consciousness: How Matter Becomes Imagination. London: Allen Lane.
Kim, J. (1993). Supervenience and Mind. Cambridge: Cambridge University Press.
Manzotti, R. (2001). Intentional Robots: The Design of a Goal-Seeking, Environment-Driven Agent. PhD Thesis, University of Genoa, Genova.
Manzotti, R., and Tagliasco, V. (2001). Coscienza e Realtà: Una teoria della coscienza per costruttori e studiosi di menti e cervelli. Bologna: Il Mulino.