With regard to the mind, I believe, and have defended, the Sartrean thesis that consciousness has no content. There is no such thing as mental content. Therefore, given one or two more plausible premises, I am committed to the Wittgensteinian claim that the world is the totality of facts, not things (if content is not mental, it must be worldly).
For this reason, I am committed to thinking of intentional directedness towards the world as a form of revealing activity, broadly understood. I am the subject of intentional states to the extent that I entertain worldly content in a certain way (credulously, desirously, emotively, and so on). And my endorsement of embodied and extended cognition follows directly from this – revealing activity often straddles neural, bodily, and environmental processes. (All this is discussed at much greater length in my recent book, The New Science of the Mind: From Extended Mind to Embodied Phenomenology (MIT Press, 2010).)
My view that consciousness is, in the above sense, empty leads me to look at least favorably on no-self views of the sort associated with Buddhism and Derek Parfit – and therefore also on the ethical consequences of these views (no absolute distinction between one's own suffering and that of others, etc.).
Intentionality (capability to represent) or consciousness? Which phenomenon is, in your opinion, more complex and fascinating?
I do not think you can separate the two. At its core, I think intentionality should be understood as disclosing or revealing activity. Consciousness is a type of disclosure. So, there is no question of one being the more complex or fascinating. There are two types of disclosing or revealing activity: causal and constitutive. Sub-personal cognitive processes disclose the world causally in the sense that they provide a physically sufficient condition for the world to be revealed in a given way to a subject. For example, if Marr's theory of vision were true, then certain processes that begin with the retinal image and conclude with a 3D object representation would be physically sufficient for the world to appear a certain way to a subject. These processes would not be logically sufficient. The subject might be a zombie, and so on. But the processes do form a physically sufficient condition. Conscious experiences, on the other hand, disclose the world constitutively in the sense that they provide a logically sufficient condition for the world to be revealed in a given way. Thus, if an experience with a certain phenomenal character – a certain what-it-is-likeness – occurs, then that is logically sufficient for a certain portion of the world to be revealed in a given way (even if the experience is illusory or hallucinatory). Consciousness is, fundamentally, constitutive disclosure of the world, and as such is a species of intentionality.
Will machines think?
AVANT Volume III, Number 1/2012