Hi Michael,
What a great post. I am particularly delighted that you brought up the Axiom of Choice, which is one of my favorite subjects. I have never had the opportunity to discuss the AC in any detail with anyone and I am eager to discuss it with you. I hope you will respond.
From what you wrote, I think you and I see pretty much eye-to-eye on the question of relating mathematics to reality. However, consistent with what you wrote some weeks ago about the difficulty in communicating any ideas via language, I think Wittgenstein might have had a field day picking the sentences of your post apart.
Not that what you wrote is wrong. It's just that in order to communicate I think it is necessary to try to get past the words and try to get down to the ideas that we are trying to get across. So let me gently take a close look at what you wrote.
I am no expert, but I think that the two different ways in which you stated the AC are not rigorously correct. In one, you said it is always possible to think of a number between any two other numbers. In the other, you said "a finite segment of line can have as many points as one wishes".
The AC is an axiom of set theory, which means, foundationally, that at the time the axioms are chosen, numbers and points have not yet even been defined. I agree with you, though, that once you have adopted the AC in your mathematical system, if you go on to define such things as points or numbers, the AC leads directly to your two statements characterizing it. So I agree that even though your characterizations might not be rigorous, they nevertheless capture the fundamental consequences of the AC.
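To make your first characterization concrete: between any two distinct numbers there is always another, because the midpoint construction never fails. Here is a tiny sketch in Python (the helper name `between` is mine, purely for illustration):

```python
from fractions import Fraction

def between(a, b):
    """Return a number strictly between two distinct numbers a and b."""
    return (a + b) / 2  # the midpoint always lies strictly between them

x, y = Fraction(1, 3), Fraction(1, 2)
m = between(x, y)
assert x < m < y  # m == Fraction(5, 12)
```

Of course this only dramatizes the density of the rationals; as you note, the AC itself lives at a more foundational level, before numbers are even defined.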
You said, "I believe it's nonsense to deny the axiom of choice."
Since I have frequently mounted the soapbox and denied the AC -- always to an indifferent audience, BTW -- I found that sentence of yours to be the juiciest I have read in a long time. I am eager to discuss it with you. I would like to slowly and carefully go into detail about why I think denying the AC is a sensible thing to do, and I would like to have you slowly and carefully tell me why you think it is nonsense. Let's start by looking at what you have already written.
You said, "Once you allow for fractions, it becomes impossible to state there's a logical limit to how much you can divide a quantity."
Watch me... There is a logical limit to how much you can divide a quantity. There -- according to you, I have just done the impossible... Just kidding. I was just having fun with language, exploiting the sort of ambiguity that Wittgenstein tried to reveal.
Seriously, the key question here is, What do you mean by "quantity"? There are, IMHO, three separate possibilities: 1) the amount of something real, 2) some measure of a mathematical concept in a system that includes the AC, and 3) some measure of a mathematical concept in a system that excludes the AC.
For case 1), we have only speculation, going back to before Democritus. Various people have speculated that (at least some) real things are continuous and thus infinitely divisible. Others have come along at various times to speculate that things are grainy at some fundamental level, so that indivisible "atoms" comprise everything real. This speculation continues today, and we really don't know. The current thinking, though, as you point out, is that things really are grainy if you go down small enough.
In case 2), your statement is correct: There is no limit, logical or otherwise, to "how much you can divide a quantity". It should be noted here that nearly all current mathematicians and their mathematical systems accept and include the AC.
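As a concrete illustration of "no limit to how much you can divide" in this setting: exact rational arithmetic can be halved indefinitely without ever reaching zero, because there is no smallest positive fraction. A minimal Python sketch:

```python
from fractions import Fraction

# There is no smallest positive rational: halving an exact
# fraction any finite number of times still leaves it nonzero.
q = Fraction(1)
for _ in range(10_000):
    q /= 2

assert q > 0  # q == Fraction(1, 2**10000), still strictly positive
```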
In case 3), your statement is incorrect. Without the AC, all mathematical systems would be discrete and would necessarily have limits on, for example, the sizes of integers, or the smallness of non-zero numbers or intervals. This finitist approach to mathematics virtually died out about a hundred years ago with Kronecker and Brouwer. I happen to believe that the worlds of mathematics and science would have been more useful if the AC had not been accepted, and I think it is not too late to shift gears and pursue mathematical systems that deny the AC.
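For contrast, here is what a bounded, discrete system looks like in practice: fixed-width machine arithmetic, in which there really is a largest integer and addition wraps around. A sketch in Python, modeling 8-bit unsigned arithmetic (the name `add8` is mine, just for illustration):

```python
MASK = 0xFF  # 8-bit unsigned values: 0..255

def add8(a, b):
    """Add two 8-bit values, wrapping on overflow (mod 256)."""
    return (a + b) & MASK

assert add8(200, 100) == 44  # 300 mod 256
assert add8(255, 1) == 0     # the largest representable integer wraps to zero
```

Whether such a bounded system could serve all of mathematics and science is, of course, exactly the question at issue.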
You go on to say, "Yet such a notion[, i.e. the AC,] creates problems, Zeno's paradoxes being some of them."
Right on! It creates problems in spades. Goedel's Theorem is also among the problems created by the AC. In my opinion, mathematicians should stand back and coolly look at the consequences of the AC for what they are. To start with, we have the paradoxes discovered right off the bat by Cantor as soon as he rigorously introduced the notion of infinity into mathematics. Then we have Russell's paradox, which you mentioned in another post. The problem culminates in Goedel's Theorem, which in my opinion can be paraphrased as: OK, if you want to allow the AC, i.e. the notion of infinity, into your system, then you have to accept that your system will be either incomplete or inconsistent, and you really can't figure out which.
To me, if you look at those unavoidable consequences, it should be obvious that you wouldn't want to choose that situation. Instead, mathematicians have, in effect, swept the nonsense and paradoxes under the rug by tacitly agreeing not to talk about the particular statements, sets, or notions that lead directly to the paradoxes. Russell did a pretty good job of identifying just exactly what should be swept under the rug and avoided in order to (hopefully) keep the mathematics consistent. I think I speak for the now-dead Kronecker in pointing out how ridiculous this position is.
You asked, "[W]hy is it that a continuous math leads to the discovery of a discontinuous reality?"
I think the answer is that it did not. It was not the continuous math that revealed the graininess of nature, but the direct examination of nature herself. In fact, the observations of nature are inconsistent with a continuous math. This is no problem, though, because all human uses of mathematics and calculation involve only discrete and limited numbers. As you point out, each and every number ever represented in the decimal system, whether on paper, in a mind, or in a computer, has had only a finite number of digits in its representation.
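Your point about finite representations can be demonstrated directly: all real computation is discrete. Unlike the exact fractions of pure mathematics, a standard IEEE-754 double-precision number cannot be halved forever; it underflows to zero after finitely many steps. A small Python sketch:

```python
# Repeatedly halve a double-precision float until it underflows to zero.
x = 1.0
steps = 0
while x > 0.0:
    x /= 2.0
    steps += 1

# On IEEE-754 doubles this loop terminates: 1075 halvings reach zero,
# because the smallest positive subnormal double is 2**-1074.
print(steps)
```

So the machine, at least, is firmly on the discrete side of the question.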
Your answer to your question is more general. You said, "The reason, as with everything in math, is related to the way we think."
Now this really excites me. I agree completely with your answer, but I think I read more into it than you intended. The reason it excites me so is your inclusion of the personal pronoun. As I have asked on many other occasions, "What do you mean by 'we', white man?"
I am chomping at the bit to tell you how I interpret the word 'we', but this post is getting too long as it is. So to reduce the risk of losing you, I'll wait for your interpretation before going into mine. Anyway, anyone who has read my earlier posts knows who I think "we" is.
(I almost sent this post without this paragraph, but I just couldn't resist giving you a hint of who I think "we" is. I think it is the "one" you mentioned when you said "a finite segment of line can have as many points as one wishes". Yes, literally "the One".)
Anyway, thanks for your invitation to comment on your post, and thanks for reading mine. I look forward to discussing this further. Let me close this by commenting on your summary:
You said, "What am I saying here? Simply, that while the AC is true as a guiding principle for abstract mathematical thinking, it's not true as a guiding principle for real problem-solving."
I would suggest the following version: "What am I saying here? Simply, that while the AC is [] a guiding principle for abstract mathematical thinking[, leading to contradictions if you accept it and discrete systems if you don't], it's not true [or useful] as a guiding principle for real problem-solving."
Warm regards,
Paul