Hi Michael,
You're welcome, and thank you for responding to me. Aside from a few little loose threads to tidy up, it seems to me that you and I agree on the substantive issues: the relationship of mathematics to reality, the usefulness (actually the non-usefulness) of the Axiom of Choice, and the fact that this subject borders on the metaphysical. I am a little disappointed that you seem reluctant to go into metaphysical speculation, so I won't take us there. If you change your mind, let me know, because I think that kind of speculation is the most fun.
Let me tidy up some of those loose threads by answering your questions and commenting on some of your statements.
You said, "There seems to be a dissociation between logic and ideas."
I agree with this, but to elaborate I would have to talk metaphysics. So I won't.
You said, "[T]he idea that the number 3.333... is a correct representation of 10/3 is often used as a trick to "prove" that 9.999.... = 10."
There is no trick involved here. 9.999... = 10 is a true statement in mathematics. There is nothing profound about it, though: it is true simply by virtue of the definition of the ellipsis, which comes down to the definition of a limit. We can go into that in detail if you'd like.
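In case it helps to see the shape of it now, here is the standard calculation, sketched in LaTeX notation; the ellipsis is defined as the limit of the partial sums of a geometric series:

    9.999\ldots \;=\; 9 + \lim_{n \to \infty} \sum_{k=1}^{n} \frac{9}{10^{k}}
               \;=\; 9 + \lim_{n \to \infty} \left( 1 - \frac{1}{10^{n}} \right)
               \;=\; 9 + 1 \;=\; 10.

Every step uses only the definition of a limit; no appeal to an "infinitieth" digit is needed.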
You said, "The axiom of choice, in my amateurish interpretation, seems to say that not only you can think of a number so small you can't even think about it, you can also think of a number that's even smaller than that. Which is funny when you think about it."
Yes, it is funny. Here's my way of thinking of the axiom of choice. Rather than thinking of small numbers, I prefer to think of large ones. There really is no difference since one is the reciprocal of the other. It's just that in the foundations of arithmetic, the natural numbers are the starting point and the fractions come later.
To me, what is going on in mathematics is the exploration of what we can think. It's not hard to think, but it is hard to describe what we think. The job of mathematics is to rigorously describe what one thinks so that another can know absolutely for sure what the first has thought.
A fundamental part of this is defining terms. In fact, any term used in mathematics must either be specifically acknowledged as an undefined primitive, or it must be defined in terms of primitives and other previously defined terms. Now, when you think about this process with respect to the natural numbers, you realize that you cannot define infinitely many of them one at a time in this way; the process only ever yields finitely many definitions.
The axiom of choice (in my amateurish view at least) essentially says you don't have to worry about defining an infinite number of numbers. You can assume that there is some automatic process that has sort of ground out an infinite set that you can use. (That, to me, qualifies as real trickery!)
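For reference, the usual textbook statement of the axiom, which is worded rather differently from my gloss, is (in LaTeX notation):

    \text{For every family } \{A_i\}_{i \in I} \text{ of nonempty sets, there exists a function } f
    \text{ such that } f(i) \in A_i \text{ for every } i \in I.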
So, if you really do have an infinite set, say, of integers, then you can always find one bigger than any integer someone presents to you. Or, equivalently, you can always find a non-zero number smaller than any number someone presents to you.
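Written out formally, just as a sketch in standard notation, the two claims are:

    \forall n \in \mathbb{Z} \;\exists m \in \mathbb{Z} : m > n
    \qquad \text{and} \qquad
    \forall \varepsilon > 0 \;\exists \delta > 0 : 0 < \delta < \varepsilon,

with m = n + 1 and \delta = \varepsilon / 2 serving as witnesses.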
If you don't have an infinite set, there must be a largest integer and a smallest non-zero fraction. But even if there is a largest integer, nothing prevents someone from defining an even larger one later on. This can continue indefinitely, but it can never become infinite. It will remain finite at all stages.
Think about it as the display size on your calculator, or the number of bits in a word in your computer. With a given calculator or computer, there is a largest integer which can be used in calculations. Attempts to use a larger one will just produce an overflow or some other kind of failure. But nothing stops someone from inventing a computer or calculator that can handle even bigger numbers.
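To make the analogy concrete, here is a minimal sketch in Python that simulates a fixed 64-bit word; the 64-bit width and the names (WORD_BITS, add_in_word) are just illustrative assumptions of mine, not any particular machine:

    # Illustrative sketch: arithmetic in a fixed-width 64-bit signed word,
    # the way a calculator display or a computer register behaves.
    WORD_BITS = 64
    MAX_INT = 2**(WORD_BITS - 1) - 1   # largest integer the word can hold
    MIN_INT = -2**(WORD_BITS - 1)      # smallest integer the word can hold

    def add_in_word(a: int, b: int) -> int:
        """Add two integers, failing if the result does not fit in the word."""
        result = a + b
        if result > MAX_INT or result < MIN_INT:
            raise OverflowError(f"{a} + {b} does not fit in {WORD_BITS} bits")
        return result

    print(add_in_word(MAX_INT - 1, 1))  # fine: the result is exactly MAX_INT
    print(add_in_word(MAX_INT, 1))      # fails: this machine has a largest integer

Widen the word to 128 bits and larger numbers become usable, but there is still a largest one; at every stage the bound is finite.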
You asked, "Do you think the concept "real" means anything in mathematics?"
No. It means nothing in mathematics.
You asked, "Is there any difference between, say, "three apples" and "three real apples"?"
Not in mathematics, there isn't. Neither term is used in mathematics.
You asked, "What does the "real" in front of "apples" mean?"
It means nothing in a mathematical context because it doesn't appear there. But, in a vernacular context, here's how I would interpret the meaning of 'real':
Case 1) Suppose I asked you to imagine three apples. After confirming that you had done so, suppose I asked you to take a bite of one of them. The best you could do would be to imagine biting one.
Case 2) Suppose you had three real apples. After showing me the apples, suppose I asked you to take a bite of one of them. In this case you could really bite a real apple.
That's the difference.
You asked, "But isn't [the fact that observations of nature are inconsistent with a continuous math] equivalent to saying that a continuous math is not self-consistent?"
No, I don't think they are equivalent even though I think both statements are true. Continuous math could be consistent even though our universe might be discrete. And, observations of nature might be discrete even though the underlying reality is continuous.
You said, "It seems to me any self-consistent set of ideas must agree with observations, since all self-consistent systems are tautologies."
Even if the underlying reality were consistent, there is no guarantee that observations must be. There may be many distortions and illusions at work between reality and our observation of phenomena.
You said, "I suppose [by "we"] I really mean "I" since I don't know how other people think, I just assume they must think as I do."
To respond to this I would have to get into metaphysics, which I promised not to do.
You wrote, "So that we don't get into metaphysics, all I meant was that "all finite segments of line have an undefined number of points". That gets rid of "the one"."
Not quite. In the new phrasing, who exactly is the definer who failed to define the points? But we are staying out of metaphysics.
You asked, "Hmm... do you think a discrete system for math would be free of contradictions?"
I think it could be. It would all depend on the consistency of the axioms. I think there are discrete mathematical systems which have been proven to be consistent, but I am not really sure. At any rate, the paradoxical consequences that come along with the AC would be absent. Furthermore, I see no reason that a discrete math couldn't contain every important theorem used in applying math to science. I have an idea on how to approach the definition of limits and continuity in a discrete system, if you are interested. As I said, this has been a favorite subject of mine for a long time.
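Just to give a flavor of what I mean, here is a toy sketch in Python, tied to the 9.999... example above; it is not the idea I mentioned, only a way to show that a limit can be approached without ever leaving finite stages:

    # Toy illustration: approach a "limit" using only finite stages.
    # Each partial sum 9 + 9/10 + ... + 9/10^n is a finite, exact fraction,
    # yet for any tolerance named, some finite stage already lands within it.
    from fractions import Fraction

    def partial_sum(n: int) -> Fraction:
        """The n-th finite stage of 9.999... as an exact fraction."""
        return Fraction(9) + sum(Fraction(9, 10**k) for k in range(1, n + 1))

    def stage_within(tolerance: Fraction) -> int:
        """Smallest finite stage whose distance from 10 is below the tolerance."""
        n = 0
        while 10 - partial_sum(n) >= tolerance:
            n += 1
        return n

    print(partial_sum(5))                     # 999999/100000, i.e. 9.99999
    print(stage_within(Fraction(1, 10**12)))  # 13 -- a finite stage suffices

At every stage you work with a finite, exact object, and for any tolerance someone names, some finite stage already falls within it.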
Thanks for a stimulating conversation, Michael.
Warm regards,
Paul