Abstract
Integrated Information Theory (IIT) posits a new kind of information, which, given certain constraints, constitutes consciousness. Searle objects to IIT because its appeal to information relies on observer-relative features. This misses the point that IIT’s notion of integrated information is intrinsic, the opposite of observer-relative. Moreover, Searle overlooks the possibility that IIT could be embraced as an extension of his theory. While he insists that causal powers of the brain account for consciousness, he maintains that these causal powers aren’t tied to protoplasmic material. Whatever these causal powers are (Searle doesn’t offer a positive account), they don’t consist in mere information-processing or computation. IIT agrees, and also positively characterizes the relevant causal powers as those involved in generating integrated information. Examining the further commitments of each theory reveals that IIT renews a fundamentally ontological challenge to information-processing and computational theories of mind.
Notes
My thanks to two anonymous referees, and to the editors, for helpful and constructive comments.
What exposition is offered here is only enough to motivate the discussion of IIT’s relation to Searle and its implications, and is not intended to be either justificatory or reductive. My hope is that this is adequate for a general readership, but for less dense exposition, see Fallon (2016, 2018) as well as various papers cited below.
Phi measures only the quantity of integrated information; the organization of the causal dynamics determines its nature. Structures of very different organization can generate the same amount of phi while still specifying qualitatively different information. So integrated information, while measured by phi, is not identical to it, and subsets of integrated information that are conscious are measurable by phi but likewise not identical to their phi values. (It should be noted that it is pragmatically impossible to calculate phi values precisely for complex systems; an ongoing project within IIT involves refining methods of approximating them.)
While it is possible to imagine a variation on IIT that refrains from identity claims (e.g. List 2018, Section 6), Tononi (personal communication) has confirmed that he maintains the commitment to such claims made explicit throughout the IIT literature. This paper will treat the canonical IIT position.
Peressini (2013) offers an expressly philosophical critique; the main claim there involves arguing that IIT focuses on the qualitative nature of experience and not on consciousness as such (see also Cerullo 2011). The current paper will bracket this issue, instead addressing IIT as it describes itself, relating it to further debate.
Searle is ambiguous about whether consciousness or mere mentality is necessary for intrinsic information. We have seen Searle (2013a, b) claim that information presupposes consciousness. Yet Searle (1992, 156) claims that unconscious mental states are still intrinsically intentional. Because intentionality is a form of information, this is in tension with the claim that information presupposes consciousness. Perhaps the former is the dominant interpretation of Searle: Manson (2003, 140), e.g., states: “Searle insists that only conscious states have intrinsic intentionality”. Searle (1992, 226) describes unconscious mental states as “dispositional,” which may support this reading.
Elsewhere (1998, 43), Searle allows characterization of conscious states by reference to access, but makes clear that this access is not epistemic, but a matter of the “inner character” (i.e. intrinsic nature) of experience.
In a very similar context, Searle writes “The aim of the Chinese room example was to try to show this” (1981, 368).
For Searle, the relationship between intentionality and consciousness is especially close. Only the mental is intrinsically intentional, and consciousness is the crucial factor: consciousness is by its nature intentional, and non-conscious mental states are intentional only by virtue of their connection to consciousness (specifically, their capacity to become conscious). Anything outside the mental (if it has intentionality at all) only has intentionality derivatively, i.e. it depends upon the (conscious) mind for its intentionality. See Section 4, Note 8, and Section 6 for how this fits into Searle’s overall ontology.
One might use the word “mysterious” to describe IIT’s posited new physical law, but the term does not apply in the same way as it does to Searle’s position. All brute physical facts are ipso facto mysterious. IIT argues for the identity claim, given the rest of the framework of IIT, as an inference to the best explanation. Whether or not the abductive justification ultimately succeeds, these are far deeper resources than Searle’s mere invocation of the brain’s causal powers.
Piccinini (2007, 94) relates Searle’s claim to pancomputationalism: While pancomputationalism claims that everything is a computing system, Searle’s claim is stronger: that “everything implements every computation”. For the historical background of pancomputationalism, as well as a response to it, see Copeland (1996). For a response to Searle, see Melnyk (1996), as well as Copeland (1996). Putnam (1988, 120–125) makes an argument very similar to Searle’s. My thanks to David Chalmers for pointing out this similarity. See Chalmers (1996, 2011) for a reply to Putnam, similar to Melnyk’s response to Searle; see Egan (2012) for commentary on Chalmers.
Jaworski (2016) defends an ontology that includes fundamental non-structured material, and structure; the latter is paradigmatically exemplified by living things, and has emergent causal powers not explicable by reference to material.
Incidentally, the exclusion postulate may be a plausible constraint for conscious existence, ruling out, as it does, conscious mice-groups and the like, but it is not clear why the MICS’s “claim to maximal existence” is necessary for existence as such. It would seem that even non-MICS integrated information satisfies the conditions for intrinsicality, because it involves a system making a difference to itself.
It is also the exact opposite of the project Koch engaged in for years, in regular cooperation with Francis Crick, of seeking the neuronal correlates of consciousness (NCC) in order to explain experience. Koch’s endorsement of IIT is a remarkable departure.
IIT’s rather dramatic ontological step may be viewed as serving to avoid the circularity charge that Page (2004) levels against Searle’s comparatively pedestrian ontology. (Dramatic as such a step might be – at least in contemporary Anglophone cognitive science – it is not alone: the recently concluded project New Directions in the Study of Mind (at Cambridge; for which Tim Crane was Principal Investigator) avowed a skepticism of physicalism.)
At an NYU workshop (November 2015) entitled “The Integrated Information Theory of Consciousness: Foundational Issues,” Tononi confessed that Koch sometimes jokes that he (Tononi) is secretly a Berkeleyan.
Larissa Albantakis (personal communication).
Edelman and Tononi (2000, 7 and 48) offer some points intended to critique functionalism; Edelman (1989), in remarks perhaps related to his later work with Tononi, dismisses functionalism, but confines his remarks to machine-state functionalism only. Koch (2012) glosses functionalism as non-explanatory in the same way that identifying the neural correlates of consciousness (NCC) fails to explain consciousness.
Functionalism here does not include only machine-state functionalism (even if the IIT literature addresses only the latter).
References
Aaronson, S. 2014a (May 21). Why I am not an integrated information theorist (or, the unconscious expander) [Web log post]. Retrieved from Shtetl-Optimized, http://scottaaronson.com/blog.
Aaronson, S. 2014b (May 30, June 2). Giulio Tononi and me: a phi-nal exchange. [Web log post]. Retrieved from Shtetl-Optimized, http://scottaaronson.com/blog.
Cerullo, M. 2011. Integrated Information Theory: a promising but ultimately incomplete theory of consciousness. Journal of Consciousness Studies 18: 45–58.
Chalmers, D. 1996. Does a rock implement every finite-state automaton? Synthese 108: 309–333.
Chalmers, D. 2011. A computational foundation for the study of cognition. Journal of Cognitive Science 12: 323–357.
Chalmers, D. 2016. Panpsychism and protopanpsychism. In Panpsychism: contemporary perspectives, ed. G. Bruntrup and L. Jaskolla. Oxford: Oxford UP.
Copeland, J. 1996. What is computation? Synthese 108: 335–359.
Dennett, D. 1980. The milk of human intentionality. Behavioral and Brain Sciences 3 (3).
Edelman, G. 1989. The remembered present: A biological theory of consciousness. New York: Basic Books.
Edelman, G., and G. Tononi. 2000. A universe of consciousness: How matter becomes imagination. New York: Basic Books.
Egan, F. 2012. Metaphysics and computational cognitive science: let’s not let the tail wag the dog. Journal of Cognitive Science 13: 39–49.
Fallon, F. 2016. Integrated Information Theory of Consciousness. The Internet Encyclopedia of Philosophy.
Fallon, F. 2018. Integrated information theory. In The Routledge Handbook of Consciousness, ed. R. Gennaro. New York: Routledge.
Hofstadter, D. 1981. Reflections. In The mind’s eye: Fantasies and reflections on self & soul, ed. D. Hofstadter and D. Dennett, 373–382. New York: Basic Books.
Jaworski, W. 2016. Structure and metaphysics of mind: how hylomorphism solves the mind-body problem. Oxford: Oxford UP.
Koch, C. 2012. Consciousness: Confessions of a romantic reductionist. Cambridge: The MIT Press.
Koch, C., and G. Tononi. 2013. Can a photodiode be conscious? New York Review of Books (7 March 2013). Retrieved from http://www.nybooks.com/articles/2013/03/07/can-photodiode-be-conscious/. Accessed 21 June 2018.
List, C. 2018. What is it like to be a group agent? Noûs 52 (2).
Manson, N. 2003. Consciousness. In John Searle, ed. B. Smith. Cambridge: Cambridge UP.
Melnyk, A. 1996. Searle’s abstract argument against strong AI. Synthese 108: 391–419.
Oizumi, M., L. Albantakis, and G. Tononi. 2014. From the phenomenology to the mechanisms of consciousness: integrated information theory 3.0. PLOS Computational Biology. https://doi.org/10.1371/journal.pcbi.1003588
Page, S. 2004. Searle’s realism deconstructed. Philosophical Forum 35 (3).
Peressini, A. 2013. Consciousness as integrated information: a provisional philosophical critique. Journal of Consciousness Studies 20 (1–2).
Piccinini, G. 2007. Computational modelling vs. computational explanation: Is everything a Turing machine, and does it matter to philosophy of mind? Australasian Journal of Philosophy 85: 93–115.
Putnam, H. 1988. Representation and Reality. Cambridge: MIT Press.
Pylyshyn, Z. 1980. The ‘causal powers’ of machines. Behavioral and Brain Sciences 3 (3).
Rescorla, M. 2015. The computational theory of mind. In The Stanford Encyclopedia of Philosophy (Spring 2017 Edition), ed. E.N. Zalta. URL = https://plato.stanford.edu/archives/spr2017/entries/computational-mind/.
Ringle, M. 1980. Mysticism as a philosophy of artificial intelligence. Behavioral and Brain Sciences 3: 417–457.
Schopenhauer, A. 1969. The world as will and representation. Trans. E.F.J. Payne. New York: Dover.
Searle, J. 1980. Author’s reply. Behavioral and Brain Sciences 3 (3).
Searle, J. 1981. Minds, brains and programs. In The mind’s eye: Fantasies and reflections on self & soul, ed. D. Hofstadter and D. Dennett, 353–373. New York: Basic Books.
Searle, J. 1992. The Rediscovery of mind. Cambridge: The MIT Press.
Searle, J. 1998. Mind, language and society: philosophy in the real world. New York: Basic Books.
Searle, J. 2013a. Can information theory explain consciousness? New York Review of Books, pp. 54–58 (10 January 2013). Retrieved from http://www.nybooks.com/articles/2013/01/10/can-information-theory-explain-consciousness/. Accessed 21 June 2018.
Searle, J. 2013b. Reply to Koch and Tononi. New York Review of Books (7 March 2013). Retrieved from http://www.nybooks.com/articles/2013/03/07/can-photodiode-be-conscious/. Accessed 21 June 2018.
Shannon, C.E. 1948. A mathematical theory of communication. Bell System Technical Journal 27: 379–423, 623–656.
Tononi, G. 2014. (May 30) Why Scott should stare at a blank wall and reconsider (or, the conscious grid) [Web log post]. Retrieved from Shtetl-Optimized, http://scottaaronson.com/blog. Accessed 23 August 2017.
Tononi, G. 2015. Integrated information theory. Scholarpedia, 10(1):4164. http://www.scholarpedia.org/w/index.php?title=Integrated_information_theory&action=cite&rev=147165. Accessed 23 August 2017.
Tononi, G., and C. Koch. 2015. Consciousness: here, there and everywhere? Philosophical Transactions of the Royal Society B. https://doi.org/10.1098/rstb.2014.0167
Fallon, F. Integrated Information Theory, Searle, and the Arbitrariness Question. Rev.Phil.Psych. 11, 629–645 (2020). https://doi.org/10.1007/s13164-018-0409-0