Sunday, November 1, 2020

The Lucas-Penrose Argument

Brace yourselves: this one can melt your brain.

In the early 20th century, it was thought that mathematics could be made into a complete, consistent formal system. This is a system in which every element has a precise definition, every entailment is deductive (so that conclusions necessarily follow from premises), every statement can be either proved or refuted (that's the completeness), and no contradiction can be derived (that's the consistency). But some basic concepts are unformalizable. "Truth," for example, allows us to form the Liar Paradox: "This statement is not true." If it's true, then it's false, and if it's false, it's true. So no consistent formal system of the relevant strength can contain its own truth predicate. (This isn't a mark against truth, btw.) One motive for demanding consistency is that a system containing a contradiction succumbs to the principle of explosion, since ex falso quodlibet: from a contradiction, everything follows.
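In case "everything follows" sounds too quick, here's a minimal natural-deduction sketch of the explosion (P and Q are placeholders of my own choosing, with P both asserted and denied):

```latex
% Ex falso quodlibet: from a contradiction, derive an arbitrary statement Q.
\begin{align*}
1.\quad & P        && \text{one half of the contradiction}\\
2.\quad & \neg P   && \text{the other half}\\
3.\quad & P \lor Q && \text{disjunction introduction on 1 ($Q$ is anything at all)}\\
4.\quad & Q        && \text{disjunctive syllogism on 3 and 2}
\end{align*}
```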

Anyhoo, Kurt Gödel, inarguably the greatest logician of the 20th century, suggested we use a concept in place of truth that IS formalizable and doesn't lead to a paradox: provability. "This statement is not provable" doesn't generate a problem like the Liar Paradox. But such a statement can be constructed within any formal system strong enough to express basic arithmetic, and because provability in such a system is just deductive derivation, the system (if it's sound) can neither prove the statement (it would then be proving something false) nor refute it (since the statement is true). It follows that no such system can be complete. This is the intuition behind Gödel's Incompleteness Theorems. We'd been chasing a mirage.
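For anyone who wants the trick in modern notation (my own paraphrase, not part of the original argument above), it runs roughly like this, where F is a consistent formal system containing enough arithmetic and Prov_F is its provability predicate:

```latex
% Diagonal lemma: a sentence G_F that "says of itself" that F cannot prove it.
\[
  G_F \;\leftrightarrow\; \neg\,\mathrm{Prov}_F(\ulcorner G_F \urcorner)
\]
% First Incompleteness Theorem: if F is consistent, F proves neither G_F nor
% (given omega-consistency, or via Rosser's trick) its negation.
\[
  F \nvdash G_F
  \qquad\text{and}\qquad
  F \nvdash \neg G_F
\]
```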

This was around 1930. About the same time, huge strides were being made in the theory of computation, the groundwork of artificial intelligence, by the likes of Alan Turing, Alonzo Church, etc. Turing came up with the idea of a Turing machine, which is an instantiation of a formal system: the mechanical cause-and-effect steps of the machine stand in for the deductive ground-consequent relations of the formal system. But since any sufficiently strong formal system contains a statement to the effect of "This statement is not provable within this system" (called a Gödel sentence), the same must hold for any Turing machine that instantiates such a system.
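To make "mechanistic cause-and-effect" concrete, here's a minimal sketch of a Turing machine simulator in Python (my own illustration, not anything of Turing's). The little transition table just increments a binary number, but the point is that every step is fixed entirely by the current state and the symbol under the head:

```python
# Minimal Turing machine simulator: each step is determined entirely by the
# current state and the symbol under the head -- pure cause and effect.

def run(tape, transitions, state="start", blank="_", max_steps=10_000):
    """Run a single-tape Turing machine and return the final tape as a string."""
    cells = dict(enumerate(tape))      # tape as a sparse dict: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Example machine: binary increment. Walk right to the end of the number,
# then carry 1s leftward until a 0 (or a blank) absorbs the carry.
INCREMENT = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry = 0, carry keeps moving left
    ("carry", "0"): ("1", "R", "halt"),    # 0 + carry = 1, done
    ("carry", "_"): ("1", "R", "halt"),    # ran off the left edge: new leading 1
}

print(run("1011", INCREMENT))  # 1011 is 11; prints 1100, which is 12
```

Every machine of this kind is exhaustively described by such a table, and that's the sense in which it instantiates a formal system: the table plays the role of the axioms and inference rules.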

This is a problem, because a Turing machine can only affirm claims that are provable within the system it instantiates, so any given machine will have a Gödel sentence which it cannot affirm. Human minds, however, have no such limitation: we can see that there is a Gödel sentence within our own systems of thought and affirm it, recognizing that it is correct. It is correct that "This statement is not provable within this system" is not provable within that system. (I'll put this step in symbols after the two consequences below.) This has two consequences.

1) Human minds cannot be reduced to Turing machines. They cannot be fully explained by the mechanistic cause-and-effect processes that are going on in the brain. There is an element of the mind that goes beyond those processes, and this element is truth-conducive.

2) Turing machines, and artificial intelligence in general, cannot fully duplicate the processes of human minds. They may be able to duplicate the end-products, but they can't produce them the way human minds do: through non-deductive (non-formal) reasoning. They can only do it via mechanistic cause-and-effect processes, which don't have to be truth-conducive in order to arrive at those end-products.
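To see why consequence 1 goes through, here is the core step in symbols (again my own paraphrase, granting the argument's assumption that a given machine instantiates some fixed, consistent formal system F with Gödel sentence G_F):

```latex
% The core step behind consequence 1. F is the (assumed consistent) formal
% system a given machine instantiates; G_F is its Gödel sentence.
\begin{align*}
\text{(i)}\;\;   & \mathrm{Con}(F) \rightarrow F \nvdash G_F
                 && \text{First Incompleteness Theorem}\\
\text{(ii)}\;\;  & G_F \leftrightarrow \neg\,\mathrm{Prov}_F(\ulcorner G_F \urcorner)
                 && \text{what } G_F \text{ says}\\
\text{(iii)}\;\; & \mathrm{Con}(F) \rightarrow G_F
                 && \text{from (i) and (ii): unprovability is exactly what } G_F \text{ claims}\\
\text{(iv)}\;\;  & \text{so anyone who accepts that } F \text{ is consistent can affirm } G_F\text{, while the machine cannot.}
\end{align*}
```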

A conclusion along these lines was reached by Gödel himself in his 1951 Gibbs Lecture, "Some Basic Theorems on the Foundations of Mathematics and Their Implications", though it wasn't published until the third volume of his Collected Works came out in 1995. J.R. Lucas, however (who, I have learned in writing this post, passed away earlier this year, which devastates me), wrote an enormously influential essay in 1961, "Minds, Machines, and Gödel", which presented the same argument. It drew a lot of objections, which Lucas responded to in philosophy journals, and in 1970 he published his book "The Freedom of the Will", the last third of which is on the implications of Gödel's Incompleteness Theorems for the mind and AI. You can read most of his essays online at https://web.archive.org/web/20160718073705/http://users.ox.ac.uk/~jrlucas/. Later, the mathematical physicist Roger Penrose defended the argument in his own way in his books "The Emperor's New Mind" and "Shadows of the Mind".

Simple, no?

1 comment:

Doug said...

Brilliantly encapsulated. Thanks for taking the time to do so!