
Explanatory Gap problem

The basic idea of the Explanatory Gap is that human experience (such as qualia) cannot be fully explained by "mere" mechanical processes, and that something "extra", more or less metaphysical in nature, must be added to "fill the gap". The Explanatory Gap has vexed and intrigued philosophers and AI researchers alike for decades and remains a source of considerable debate.

To illustrate a no-gap condition, imagine a modern computer: as marvelous as these devices are, their behavior can be fully explained by their circuitry, so there are no causes left over to attribute to consciousness. According to proponents of the Explanatory Gap, any AI built as software running on such a machine would likewise be fully accounted for by its mechanism, and thus would not be conscious.

The Explanatory Gap is closely related to, and perhaps better known through, the Chinese Room argument proposed by John Searle.
