I thought of sharing this with you guys, in case anyone cares:
I was thinking about Zeno's paradoxes and why they remain so puzzling. If we knew what we were doing, we should have figured out the problem long ago, so I set out to find a solution. As it turns out, Zeno's paradoxes are a direct consequence of the idea behind the density property of the number line. If you don't know what it is, the density property states that, no matter how small the difference between two numbers, it is always possible to find a number in between them.
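The density property is easy to demonstrate: the midpoint of any two distinct numbers always lies strictly between them. Here's a minimal sketch in Python using exact fractions (the helper name `between` and the sample values are mine, just for illustration):

```python
from fractions import Fraction

def between(a: Fraction, b: Fraction) -> Fraction:
    """Return the midpoint, which lies strictly between a and b whenever a != b."""
    return (a + b) / 2

# Two numbers only a millionth apart -- still a number fits between them.
a, b = Fraction(1, 1_000_000), Fraction(2, 1_000_000)
m = between(a, b)
assert a < m < b
print(m)  # 3/2000000
```

Exact fractions (rather than floats) matter here: the midpoint construction can be repeated forever without ever running out of room, which is exactly what the density property says.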
I believe it's nonsense to deny the density property. Once you allow for fractions, it becomes impossible to claim there is a logical limit to how finely you can divide a quantity. Yet this notion creates problems, Zeno's paradoxes among them. So where do the problems come from? And why does a continuous mathematics lead to the discovery of a discontinuous reality? The reason, as with everything in math, lies in the way we think.
Let me state the density property in another way: "a finite line segment can have as many points as one wishes". A corollary would be: "a point can be as small as one wishes". So far so good, I hope, but now let me introduce a little twist: "a point can be as small as one wishes, but not smaller". What am I saying here? Simply that while the density property is true as a guiding principle for abstract mathematical thinking, it is not true as a guiding principle for real problem-solving. When it comes to calculating the value of any number, you have to stop writing digits at some point. And whenever you do that (stop writing digits), you have defined the granularity of your solution.
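The "stop writing digits" step can be made concrete. A quick sketch in Python's `decimal` module (the helper name `quantize_to` is mine): choosing how many decimal places to keep is precisely choosing the granularity of the answer.

```python
from decimal import Decimal

def quantize_to(x: str, places: int) -> Decimal:
    """Stop writing digits after `places` decimal places; this fixes the granularity."""
    return Decimal(x).quantize(Decimal(1).scaleb(-places))

print(quantize_to("3.14159265", 2))  # 3.14    -> granularity is 0.01
print(quantize_to("3.14159265", 5))  # 3.14159 -> granularity is 0.00001
```

The same input yields different "solutions" depending only on where we decide to stop, which is the point of the argument: the cutoff belongs to the solver, not to the problem.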
What has become evident to me is that conscious beings, at least ones like ourselves, will always find a limit to how small things can possibly be. And what sets that limit has nothing to do with the nature of the problem, and everything to do with the smallest number they can think of as the solution to a given problem. For instance, the smallest amount an economist can think about is one cent. In physics, the smallest length is the wavelength of the radiation used to measure things; since he doesn't know anything smaller than that, a physicist cannot make any meaningful mathematical statement involving sizes below it. Hence, the Planck length!
What this suggests is that the so-called uncertainty principle is a feature of our way of thinking, not of reality. Any number we can possibly put forward as the solution to a mathematical problem contains an amount of uncertainty in itself. For instance, the number 3.14 means "everything from 3.140000... to 3.149999..."; the number 3.1415926548 means "everything from 3.14159265480000... to 3.14159265489999...". All proposed solutions to mathematical problems are necessarily uncertain, for the simple reason that we have to stop writing digits at some point.
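The interval reading of a truncated decimal can be computed directly. A small sketch (the helper name `implied_interval` is mine): a decimal written with n places stands for a half-open interval of width 10^-n.

```python
from fractions import Fraction

def implied_interval(s: str):
    """A truncated decimal like '3.14' stands for the half-open interval [3.14, 3.15)."""
    places = len(s.split(".")[1]) if "." in s else 0
    step = Fraction(1, 10 ** places)  # the granularity: one unit in the last digit
    low = Fraction(s)
    return low, low + step

lo, hi = implied_interval("3.14")
print(lo, hi)  # 157/50 63/20, i.e. the interval [3.14, 3.15)
```

Writing more digits shrinks the interval but never closes it, which restates the post's point: every finite answer is an interval in disguise.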
All comments, questions, and criticisms are welcome. Thanks for reading this far.
