Hi Nathan,
Welcome to the forum. I don't think I have talked with you before. (Although I suppose Alzheimer's is sort of like insanity in that you can't tell if you have it yourself.)
"I was browsing the archives and ran across someone saying that unless a number be conceived, it does not exist. (I think it was Paul)"
There's a good chance it was me. In describing my own beliefs about the nature of reality, I have said that I think God did some mathematics long before the big bang and that part of that was the construction of some numbers. Those numbers, and any other mathematical concepts she dreamed up, comprise the Platonic world. Hence I probably qualify as a Neoplatonist.
Or, in another context, the foundations of mathematics, I have described part of the controversy between Cantor and Kronecker, among others, over whether axioms should be accepted that allow for the definition of infinite sets of numbers. I side with Kronecker and think things would be better if we denied the Axiom of Infinity and thus allowed no infinite sets at all. Our preferred system would contain only a finite number of numbers. In other words, numbers would not exist in the system unless and until they had been specifically defined (i.e. conceived). Hence I probably qualify as a constructivist or an intuitionist.
On other occasions, I have tried to explain the difference between numerals, cardinal numbers, ordinal numbers, and sets of real things, in an attempt to clear up some confusion as to what the word 'existence' means with respect to numbers. In this attempt, I probably qualify as a failure.
"This was used as rationale for a noncontinuous universe theory."
Close, but not quite. I noted that current science seems to be disclosing a noncontinuous universe. This was used, in part, as a rationale for trying to persuade others to see the wisdom of siding with Kronecker and me in the foundations of math controversy. My argument was that if the universe is grainy, wouldn't a grainy math seem more suitable to describe it?
"This seems to me to be an unwarranted anthropic bias..."
I suspect it seems that way to you because you have mixed up a few different contexts.
I admit that my philosophical views are rife with anthropomorphisms. I attribute to God many (seemingly) human attributes such as consciousness, will, intellect, memory, imagination, etc. I agree that many people would claim such attributions are unwarranted. For example, I think there are some sects of Islam (or Buddhism or something else) in which there is a death penalty for anyone daring to utter an opinion as to some attribute of God. Also, I think I remember someone on this forum recently hinting that it is "unfashionable" to anthropomorphize these days.
But in my view, it is not unwarranted. The way I see it, philosophy is all speculation. Nobody has any facts to go on (with one exception: I believe it is a fact that thought happens). So, in the absence of facts, it seems fair and useful to me to use whatever hints we have to build our speculations. Since we seem to be thinking humans, it seems a good bet that some of the attributes we seem to have actually exist in fact. It is those that form the starting point of my philosophical system, and it is those that give it the taint of anthropomorphism. I admit it, but I don't apologize for it.
Now, in the context of the foundations of mathematics, there is no anthropomorphic bias whatever. The decision to accept or reject the Axiom of Infinity has nothing to do with anything anthropomorphic, nor does the choice of any other axiom. Nothing anthropomorphic shows up anywhere in mathematics.
"that begs me to ask about unseen falling trees."
I'm not sure exactly what question you are begged to ask, but let me guess that it is the old trick question of whether a falling tree makes any sound if there is no one around to hear it. It is a trick question hinging on the definition of the word 'sound'. Dictionaries are ambiguous in their definitions of 'sound'. If sound means the sensation of hearing vibrations, the answer to the trick question is "no". If sound means the vibrations then the answer is "yes". There is nothing philosophically profound about this question.
"If you accept the representation of the number three in language as sufficient for its existence,"
In the context of calculation or computation, the representation of the number three is sufficient for its existence. However, the representation can't be made until after a definition has been made. For example, a plastic key with the numeral '3' embossed on it will not be useful for calculation unless it is placed on top of a microswitch which is part of a well-thought-out calculator or computer circuit. Once that is done, you only need to teach someone that this particular key is for three, and that will be sufficient for them to use the calculator or computer.
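That microswitch picture can be put in code. This is a toy of my own invention, not any real calculator's firmware: the point is only that the symbol on the key cap buys you nothing until the circuit, i.e. the mapping, has defined it.

```python
# A toy sketch (mine, not real calculator firmware): the embossed symbol
# '3' is meaningless until the circuit maps it to a defined value.
KEY_TO_VALUE = {}          # the "definition" step: empty at power-on

def define_key(symbol, value):
    """Wire a key cap symbol to a number the machine already knows."""
    KEY_TO_VALUE[symbol] = value

def press(symbol):
    """A key press is only useful after its definition has been made."""
    if symbol not in KEY_TO_VALUE:
        raise KeyError(f"key '{symbol}' is wired to nothing yet")
    return KEY_TO_VALUE[symbol]

define_key('3', 3)
print(press('3'))          # prints 3: the representation now suffices
```

Before `define_key('3', 3)` runs, pressing '3' raises an error: the representation exists, but the number, in this little system, does not.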
In the context of mathematics, accepting the representation of the number three in language is NOT sufficient for its existence. Nothing is said to exist in mathematics unless and until it has either been defined or accepted as a primitive undefined term. So the number three doesn't exist in mathematics until it has been defined. And, unless you have studied some mathematics, you probably don't appreciate what this involves. I have done this once before on this forum, so I'll do it again since it will be easier this time. Hold on........Aaaaagggghhh! It wasn't as easy as I thought.
I went to the shelf to grab my copy of Whitehead and Russell's "Principia Mathematica (to *56)" and found an empty space where it should have been. Somehow I have misplaced that book. Anyway, what I was going to do was find the page on which the number one is finally defined. It's on page two hundred and something, and those two hundred pages contain the groundwork of mathematical development needed even to make a coherent definition of the number one. Once the number one has been defined, it doesn't take nearly as many pages to define the number two, and from there the number three is defined.
My point is that in a rigorous mathematical context, the definition of the number three is not at all trivial and the mere acceptance of a numeral (i.e. a symbolic representation of the number in language) is not at all sufficient for its existence.
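To give a flavor of what "defined" means here without the two hundred pages, here is a toy construction in the von Neumann style, where each number is the set of its predecessors. This is emphatically not Principia's own route (Whitehead and Russell define cardinals as classes of similar classes); it is just a short sketch of the same moral: 'three' exists only after it has been built from prior definitions.

```python
# Von Neumann-style construction (a sketch only; Principia's actual
# development is different and much longer): a number is the set of
# all previously constructed numbers.
ZERO = frozenset()                 # 0 is the empty set

def succ(n):
    """Successor: n + 1 is defined as n together with {n}."""
    return frozenset(n | {n})

ONE = succ(ZERO)                   # {0}
TWO = succ(ONE)                    # {0, 1}
THREE = succ(TWO)                  # {0, 1, 2}

print(len(THREE))                  # prints 3: the set built for 'three'
```

Notice that `THREE` cannot even be written down until `ZERO`, `ONE`, and `TWO` have been constructed first, which is the whole point.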
"why not an extended representation that involves an algorithm?"
Again, the answer is different in different contexts:
In the computational context, there is no reason whatsoever, and it is done in spades. Numbers in computers and calculators are not stored as the numerals you see on the keys; they are stored as bit patterns, and it is algorithms operating on those patterns that give them their arithmetical meaning. When you press a key on a calculator, or when a computer program begins a calculation, those algorithms are run at appropriate times to produce the result.
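One way to make "a number is an algorithm" vivid is the Church numeral. This is an illustration (Church's idea, not how actual hardware stores integers): there, three simply is the procedure "apply f three times."

```python
# Church numerals: the number IS the algorithm. (An illustration only;
# real machines store bit patterns, but the spirit is the same.)
zero = lambda f: lambda x: x                   # apply f zero times

def church_succ(n):
    """The successor numeral applies f one more time than n does."""
    return lambda f: lambda x: f(n(f)(x))

three = church_succ(church_succ(church_succ(zero)))

# "Running" the numeral with (+1, starting from 0) recovers the familiar 3.
print(three(lambda k: k + 1)(0))               # prints 3
```

Here there is no numeral '3' anywhere; the number exists only as a procedure that must be run to yield anything.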
In the mathematical context, your question summarizes the controversy at the turn of the 20th century. Cantor and Hilbert said (and I paraphrase, since I didn't take good notes; just kidding, I wasn't actually there), "Why don't we assume, as an axiom of math, that there is a magic algorithm out there somewhere that is SOOO fast and has already been running SOOO long that now, in the 20th century, we can safely assume it has already cranked out an infinite number of integers for us to use. Let's assume all those integers exist."
Kronecker said, (and I chimed in much later after I was born.) "You gotta be kidding. Assuming such a thing is preposterous! First of all, algorithms have to "run" in time. And they can only crank out a finite number of integers in a finite amount of time. So when did this algorithm start running? Hmmmmmmm? It would have had to have started infinitely long ago in order to have that infinite set of integers ready for us here in the 20th century. Harrumpf! I reject it out of hand."
As it turns out, Kronecker and I lost. Mathematicians have been following Cantor now for over a hundred years. But I, for one, still reject the notion.
"How about one as simple as a 'dozen', requiring one to count elements before ascribing 'dozenness' as a quality of a set?"
No problem. I have no problem accepting any number that has been defined, and twelve, by any other name, has been defined many times. It's the numbers that have not been defined by anyone that I can't accept. People following Cantor assume that there are these really, really big numbers that exist even though nobody ever defined them.
I think of it this way. If you had a calculator with a 15-digit display, you could calculate with numbers up to 15 digits long. You couldn't calculate with bigger numbers, or with numbers that gave an answer bigger than 15 digits. The numbers up to 15 digits are defined in the calculator; bigger numbers are not.
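That bounded calculator is easy to sketch. This is my own toy, with an invented "refuse to answer" rule for overflow: inside it, a number past 15 digits simply isn't there to be calculated with.

```python
# A toy 15-digit calculator (my own sketch): numbers that would need a
# 16th digit were never defined inside this system, so it refuses them.
LIMIT = 10**15 - 1          # largest 15-digit number

def add(a, b):
    """Add within the system; beyond 15 digits, nothing exists here."""
    if a > LIMIT or b > LIMIT:
        raise ValueError("operand is not a number in this system")
    total = a + b
    if total > LIMIT:
        raise ValueError("result is not a number in this system")
    return total

print(add(7, 5))            # prints 12: inputs and result all exist here
```

Asking this machine for `LIMIT + 1` gets you an error, not a number, which is exactly how I regard the undefined "really, really big" numbers.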
Now I ask you, can you imagine a calculator with a million digit display? You say, "Yes. It's a little hard to imagine a million things, but with a little work, yes, I can."
So then I ask, "Does that imaginary calculator exist?" And then while you are hemming and hawing about how it sort of exists in your imagination and it depends on what you mean by exist, etc., I interrupt and ask,
"Never mind about its existence, what I really want to know is can it calculate and does it give correct answers?" I don't think so.
That is what I think of Cantor's infinite sets. And, as for giving correct answers, look at the answers Goedel got from using them. I call it nonsense.
If I haven't answered your questions, Nathan, I hope I at least entertained you a little, even if I didn't shed much light.
Warm regards,
Paul
