Re: Time Travel?....To What Exactly? / OO-Universe

Posted by daViper on August 9, 1999 02:51:50 UTC

: >>I do think you MAY have missed my point a little, or at least in some cases misinterpreted what I was trying to say.

: My interpretation was that you were trying to use our arbitrary bias towards using base ten as a case that could be generalized to infer that our perception of time might be just as arbitrary.

::::::::::::::: Not arbitrary. Actually quite specific and possibly even wrong.

My problem was the 'impedance mismatch' that resulted in my brain when I attempted to follow the argument through. The base ten issue is an issue of representation, not misperception - numbers still do what they *should*, regardless of the base they are expressed in. If time did not behave the way it *should*, i.e. if a person or group has the perception that the past or future are "actual physical states of existence", but it is the case that they are not, then that is genuine misperception. The person or group who held these ideas would not simply be representing them in an arbitrary fashion - they would be just plain wrong.

:::::::::::::::::: As I said, not arbitrary.

: >>It's all in good clean fun anyway right?

: Of course, I doubt that the mysteries of time and space are going to unfold before us on this board ;0

: >>If it were the case that switching the representation of a value to a different base somehow invalidated what we know about mathematics (like 1 + 2 = 3 but 10 + 01 ≠ 11 - as in: oops, that law of addition stuff was only in my decimal-thinking head), then the analogy with our perception of time (or lack of it) would be relevant.

: >>: :::::::::::::: I think you are making my point for me here.

:::::::::::::: My original statement never suggested that switching base usage invalidates any other base or math. In fact, I thought I was very clear that it does not, and though I said it differently, it is still the same thing you are saying here.
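To make that concrete, here is a quick illustrative sketch (Python, purely for compactness - any language would do) showing the same addition holding in both notations:

    # One value, two notations: the base changes, the arithmetic doesn't.
    a = int("10", 2)             # binary "10" is the number two
    b = int("01", 2)             # binary "01" is the number one
    assert a + b == 3            # 1 + 2 = 3, stated in decimal...
    assert bin(a + b) == "0b11"  # ...and 10 + 01 = 11, stated in binary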

: You lost me. I can't see how I am making your point. I thought I was demonstrating the reason I believed that you had overgeneralized from your base case...

: >>Sorry, compilers and function libraries may make it easier for the higher-level-language software developer who works in Visual Basic, FoxPro, Java, and HTML to write code that does not itself have to deal with the conversion required to store decimal numbers in binary format while maintaining the integrity of the decimal point and the degree of accuracy, but the people who write the compilers, interpreters, and assemblers do. It still has to be done AT SOME LEVEL, whether the '90s application programmer has to deal with it or not.

: I think all this was done a while ago with any mature language running on a mature platform. I suspect that even the compiler writers are simply reusing existing code, and are only occasionally revisiting the code when they are translating an existing language to a new platform with different op-codes and/or register layouts, or they are developing a new high-level language altogether. Neither of these events occurs all that frequently (I'll qualify this by limiting the universe of discourse to commercially viable application programming languages - who knows how many homegrown interpreters/compilers there are out there for special languages like CAD, CNC, DMIS, etc.). My point is that C++ compiler code that handles basic floating point types on the i386 platform isn't all that volatile :) Without a doubt, *the vast majority* of programmers never give this issue a second thought in their day-to-day activities, and probably couldn't tell you the specifics of how this is accomplished without looking it up. (I dare say that there is a large subset of folks who write code for a living these days who still wouldn't get it even if they did - but they are still capable of producing useful work output as programmers.)

::::::::::::::: It is immaterial when the original conversion processes were written. The software still exists, runs, and performs the conversions at this very moment. Chip architecture advancement itself necessitates a reworking of some of these base machine code functions to run properly. The process occurs whether it is from old routines or fresh ones. Every time a new chip design comes out, some new microcode has to be written to operate its gates. This code determines how compilers and assemblers have to manipulate higher level functions to operate properly. Until we come up with another technology entirely different from what we use today, we will still have to do SOME machine coding at the gate level. I don't any more, but there are still those who do. The user and even application programmers don't deal with this any more, but engineers at the design level still do. The day is already coming when even some C++ programmers do not understand the basic circuitry functions of a PARTICULAR chip design. (No offense to my fellow C++ programmers out there, but this actually goes to the portability issue of C++ itself. Not a slur on programmers.) My point is that it occurs. In any event, a discussion on the history of compiler theory is not germane to this time issue I'm bringing up at all. It was proffered merely as an example of how decimal is not something that works across the board in life.
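Anyone who wants to see the decimal-to-binary conversion problem with their own eyes can try this small illustrative demo (Python shown, but the underlying IEEE binary floating point behaves the same from C++ or anything else):

    # 0.1 has no exact binary representation, so the machine stores
    # the nearest binary fraction it can - and the round-off shows through.
    print(0.1 + 0.2 == 0.3)        # False
    print(f"{0.1:.20f}")           # 0.10000000000000000555...

    # Preserving "the integrity of the decimal point" takes extra work:
    from decimal import Decimal
    print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))   # True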

: >>(By the way, "The Commodore" was one of my favorite heroes and I had the pleasure of attending 3 of her lectures. I used to have a Bulldog I named after her.)

: Cool.

: >>I agree here with you also but it still would not alter the fact that binary representation, in and of itself, is the ONE math base that ANY civilization could use for cross communication with another.

: I'm afraid I don't buy this. I believe that any civilization that understands mathematics need only see the value tokens enumerated from least to most significant, and the arithmetic operator and equality symbols demonstrated with a few simple equations (0+1=1, 0+2=2, etc.); they would get the drift pretty quickly. Throw in a few model equations and/or drawings for, say, the area of a rectangle, circle, etc., and it seems even more likely that they would get it PDQ. The only advantage that binary has is that they would need only learn two symbols, but, if they understand math, it shouldn't be any harder for them to understand how to change bases than it is for us. Once they get the idea that you're doing mathematics with your symbols, and not conveying the directions to the nearest McDonalds, it's all downhill from there.

::::::::::::: Sorry, I don't buy YOUR analysis here. I think you are again putting words in my mouth I never implied or said. I never even SUGGESTED that any society would adopt binary as its math of choice. I MERELY stated that binary would be the ONE math base ANY civilization would be able to understand as presented by another. Regardless of what math base(s) any society used, binary would always be a good starting point for understanding or communicating with an unknown race. Whether any society would continue to use binary as its main method is, again, another digression onto another topic that has nothing to do with my time concept point.
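And changing bases really is mechanical once you have the idea of place value - a few illustrative lines (a sketch, nothing more) cover every base at once:

    def to_base(n, base, digits="0123456789abcdef"):
        """Render a non-negative integer n in any base from 2 to 16."""
        if n == 0:
            return digits[0]
        out = ""
        while n:
            out = digits[n % base] + out   # peel off the least significant digit
            n //= base
        return out

    print(to_base(42, 2))    # 101010
    print(to_base(42, 10))   # 42
    print(to_base(42, 16))   # 2a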

: >>Whether THEY would use it to develop computers is not an issue to my overall point. It is probably the only math base you could make this "universality" statement about, however. It's why we put it on the Voyager spacecraft and sent it as a signal from Arecibo.

: If it is a fact that we use base ten due to our physiology then this is a demonstration of one of our *natural* biases. But we don't just start out biased, we are blessed with the wonderful ability to go off and cook up an infinite number of *man-made* biases. The idea that we use base ten because we have ten fingers sounds good to me. But isn't it possible that the reason we use a base of higher order than binary is that when we discovered that math was something we wanted to do, it was just too damn hard to chisel all those ones and zeros to do any practical calculations with a reasonable degree of precision? (I wonder what the area of my field is? - Damn! Better go cut some more stone...)

: Compactness of representation is also a good reason *not* to use binary in everyday computation.

::::::::::::::::: See last comment.
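(For what it's worth, the chiseling argument is easy to quantify - binary needs roughly 3.3 times as many digits as decimal, as a quick illustrative count shows:)

    from math import log2
    n = 10**12                  # a trillion - somebody's very large field
    print(len(str(n)))          # 13 decimal digits
    print(len(bin(n)) - 2)      # 40 binary digits
    print(log2(10))             # ~3.32 binary digits per decimal digit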

: Sure, *we* would find it real handy if we received a binary encoded signal, since that's how *our* computing machines and communications infrastructure operate. Is it at all possible that the reason "..why we put it on the Voyager spacecraft and sent it as a signal from Arecibo" is because of our tendency to extrapolate everything we "know" as being "universal"? If you ask me, the analog VHF television signals that we have been sending out are as (un)likely to be of any use to another civilization as any other signal we send, regardless of how we encode it. Regardless of the *sophistication* (gasp) of the signal we broadcast, it seems very likely to me that the only purpose it will serve is that of a beacon that shouts out "we have intelligence and here we are", assuming that ET is looking for, or even knows what to do with, an RF signal in the first place.

::::::::::::::::: Hmmm. We should discuss SETI in another thread. You may find that I agree with you here, but again......

: ::::::::::::: Thank you! :-)

: >>My original dissertation is aimed at the folly of thinking of going to "the past" or "the future" as if they are genuine tangibles; that is the issue.

: The human imagination is a wonderful thing, isn't it? It's like our brains work overtime trying to connect all the dots, even where there are miles of empty space between them.

: Maybe where the trouble starts is when things get too abstract. Let's see if I can agree with you by bringing this high-falutin' physics stuff down to a level that I can relate to.

::::::::::::::::: I believe the abstraction is in thinking of the past or future as tangibles. We humans have complicated the whole time issue beyond what is necessary. THIS IS MY REAL POINT. You may be making it for me again here.

: It seems pretty clear that after you burn a piece of wood, what you end up with is heat, ash, and smoke - and also *no* way back. In this specific case, one does not expect there to be a way to "unburn" the piece of wood by recombining these end products of combustion. Traveling backwards in time would be like unburning all of the fires in the universe at once! How the hell could one expect to do that, when it seems pretty obvious that there is no way to undo the effects of one fire, let alone a universe full of them!

: Velocity often gets linked with this notion of traveling into the past, so let me explore that idea. If I were to observe a star, located 100 light years away, blow up, even if I could travel there at that same instant (remember, light took 100 years to travel the same distance, so I am really m-o-v-i-n-g), wouldn't I still be 100 years too late to do anything about it (assuming that I am able to do something that can halt a star's destruction in the first place)? If I can't get ahead of cause and effect traveling at infinite velocity, why then should merely traveling faster than the speed of light change anything? Maybe I am missing something here, but I tend to agree with you, at least in this 'direction' of the 'timeline' (sometimes we are the innocent victims of our own language).

: "Travel" to the future is a different story. If I accept the fact that any "travel to the future" is done on a one way ticket, then yes, I believe I can "travel" to the future. Since we all are "traveling to the future" simply by continuing to exist, let's take "travel to the future" to mean "at a rate greater than what I am currently experiencing". To do this, I simply pack my bags into my frame of reference, step on the gas (a.k.a the differential rate of time controller), head out for a bit, and come back, and let time dilation do all the work. Since my working definition of "travel to the future" only requires that *I* experience this 'movement through time' at a rate greater than what I was experiencing in my old frame of reference (which I return to when I finished my joyride), it's mission accomplished! Of course, it would be easy to overlook the fact that the 'future' that I am arriving at wasn't already there, the way that the state of California would be if I where to travel to it. No, it came into being precisely as I arrived in exactly the same way it would have if I did not take my trip, and just sat around waiting for it to happen. In fact, I still waited for it to happen, just not as long ;)

: Having conducted this little exercise, I find that we are in complete agreement: one can no more travel through time than one can wait for a distance. (Damn! A simple check for dimensional consistency would have saved me from all this!)

: I've just got one little unresolved issue that you may be able to shed some light on. What if during my journey to the future (as described above) I accidentally stepped on the gas too hard, and achieved 100% light speed? Wouldn't my differential rate of time with respect to *any* other frame of reference be infinite? It seems like I might then be in *real* trouble because I would, quite literally, "run out of time". With no way to go back to the past, it looks like I'm toast - what do you think? Of course, there is always an upside. Since time stops in my frame of reference, I get to live forever. Also, I could travel *any* distance in what appears to be an instant in my frame of reference. But if that's true, then I must be everywhere at once! Let's see: I live forever and I'm everywhere at once, which begs the question: If I achieve light speed, will I become God?!

::::::::::::::::::: (I thought you turned digression mode off.) :-)

WHEN (and IF) we can achieve light speed, I'll re-evaluate my concept.
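That "IF" is the operative word. The standard Lorentz factor blows up as you approach c, which a few illustrative lines make plain:

    from math import sqrt

    def gamma(beta):
        """Lorentz factor for a speed given as a fraction of c."""
        return 1.0 / sqrt(1.0 - beta**2)

    for beta in (0.9, 0.99, 0.999, 0.999999):
        print(f"{beta} c -> gamma = {gamma(beta):.1f}")
    # gamma grows without bound as beta -> 1; at beta = 1 it is
    # undefined, which is the math's way of saying "IF", not "WHEN".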

Semantically, *I* do not call time dilation (the classic Twins Paradox) time travel, since both twins still progress along the same time line. The relatively faster one just has his clocks, including his biological one, run slower.
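To put illustrative numbers on that (a hypothetical round trip of 4 light-years each way at a constant 0.8c, ignoring the turnaround):

    from math import sqrt

    distance_ly = 4.0     # hypothetical one-way distance, in light-years
    beta = 0.8            # cruise speed as a fraction of c
    earth_years = 2 * distance_ly / beta            # 10.0 on Earth's clock
    ship_years = earth_years * sqrt(1 - beta**2)    # 6.0 on the ship's clock
    print(earth_years, ship_years)
    # Same timeline, same destination "when" - the traveling twin
    # just logged fewer ticks getting there.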

Since we do in fact "proceed" into the future as a part of our normal daily lives, I also see no need to call this time travel, since "travel" to me implies that I take some action to cause something to occur, or that some movement is required to accomplish the task.

What is classically referred to as "time travel" (going to the past or the future) is and always will be impossible in my opinion. The reason is not due to undeveloped technology as some believe, but due to a misconception as to what the "past" and the "future" are in the first place. You cannot travel to somewhere that is not there.

THIS is the essence of my point. Nothing more, nothing less.

I could have used mythology or religion as a topic to demonstrate how concepts are formed and become commonplace, but I chose decimal math since I thought it would be a "safe" analogy that wouldn't spark some side philosophical discussion that was off point.

Maybe I was wrong on THAT score.

I would think that anyone could see how DECIMAL math is but a concept of human thought, not math itself. The computer operation analogy was merely to drive the point home about it being conceptual as a method, when other methods not only exist, but are used every day. That's all.

I stand by the original composition.

Thanx.
