Will zero divergence ever be possible?

This is extracted from one of the original posts:

"Is it physically possible for you to get back to THIS time line once you leave?"

John Titor: "Not with the machine I have now."

---

I understand the implications of the above statement are endless, but I simply cannot fathom that there would never be technology developed that would enable one to go back home to zero divergence. Remember, zero divergence is mathematically possible, so all we need is the technology to do so. Could take 200 years to develop, could take 2 years from 2036, but regardless, I personally believe it will be possible!

Divergence is a mathematical concept (Calculus):

http://www.ittc.ku.edu/~jstiles/220/handouts/The Divergence of a Vector Field.pdf

http://en.wikipedia.org/wiki/Divergence
Paula,

He was talking about a damned time machine. A TIME machine! WTF was stopping him from zipping a few years into the future to pick up the better technology? He said that they were working on developing the C206 gadget.

But to the central question: why no zero divergence? The timeline is a system that generally unfolds as if governed by the laws of thermodynamics (probably because it is governed by the laws of thermodynamics). Thermodynamics is statistical in nature. You can predict the future evolution of the system in a general way, but you can never, ever predict the results of that evolution to an arbitrary degree of accuracy. You'll always have to include error bars. That's how our world works. We don't even have to refer to quantum mechanics to see this issue.
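That statistical point can be sketched with a toy simulation (entirely my own illustration; the distribution and sample size are arbitrary choices): averaging many random trials pins down the mean ever more tightly, but the standard error is non-zero for any finite number of trials.

```python
import random
import statistics

# Toy illustration of statistical prediction: estimate the mean of a
# noisy process.  The estimate improves with more trials, but the
# standard error (the "error bars") never reaches zero for any finite
# number of trials.
random.seed(42)
trials = [random.gauss(0.0, 1.0) for _ in range(10_000)]
mean = statistics.fmean(trials)
stderr = statistics.stdev(trials) / len(trials) ** 0.5
print(f"estimated mean = {mean:.4f} +/- {stderr:.4f}")
```

Shrinking the error bars is expensive, too: four times as many trials buys only half the standard error.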

And then, within the Titor Saga, one has to open her eyes and look at what he wrote. Boomer was one of the better writers to come along. But his science stunk it up - big time. Go back and look at his posts. Some of the time he referred to this divergence crap. At other times he referred to the Everett-Wheeler Many Worlds Interpretation, which has nothing to do with what he termed divergence. In fact he said Everett-Wheeler was correct. In reference to that he clearly stated that this is not his world and that, because of Many Worlds, he can't go back to his world. Not because of some ill-defined (undefined, if the truth be told) notion of divergence, but because of multiple universes.

In any case it was a garbled, almost entirely unintelligible mess and an arbitrary kit-bash of several theories tossed together like a salad and served up ad hoc as he responded to different posts. You might notice that his timeline for departure suddenly got moved up a few months when Dave Trott, at the time a PhD candidate in astrophysics, engaged him on the Post-2-Post "I am from 2036" thread. Boomer suddenly decided that his window was open and he had to leave. Dave is now an astronomer in Denver.

 
Not because of some ill defined (undefined if the truth be told) notion of divergence but because of multiple universes.
Yep. As even someone with high school math skills understands, anything measured as a percentage has base units of measure associated with it. When pressed for the units of measure behind this relative measure of percent divergence, Titor could not answer beyond the blindingly obvious: "It is an empirical measure." Well, of course it is. But that still does not say in what units one measures something, and compares it to something else measured in those same units, to derive this divergence percentage.

The formula for a percentage is no secret:

(x - y) / x * 100 = percent divergence between x and y.

And the ONLY rule in the use of this equation is that the units of x and y must be consistent. They must be the same. But Titor could never define those physical units, ostensibly because nothing would truly make sense in a time travel story when in fact one must travel through Space-Time.
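As a quick sketch of the formula above (the function and variable names are my own), the computation itself is trivial; the non-trivial part, as the post says, is that x and y must carry the same physical units:

```python
def percent_divergence(x: float, y: float) -> float:
    """(x - y) / x * 100.  Only meaningful if x and y share the same units."""
    if x == 0:
        raise ZeroDivisionError("the reference value x must be non-zero")
    return (x - y) / x * 100.0

# Consistent units (say, both durations in seconds):
print(percent_divergence(100.0, 98.0))  # 2.0
```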

RMT

 
With or without Titor, zero divergence is mathematically possible.

Also, error and uncertainty are used in science all the time, but are these merely a human factor?  Could digital math (aka computers) eliminate uncertainty and errors so that, with the proper technology, zero divergence could be achieved?

 
With or without Titor, zero divergence is mathematically possible.
As I pointed out above, before anything is possible (physically or mathematically) one must first define the physical units of measure that define divergence. You cannot speak intelligently about any kind of "divergence" unless and until you describe, in physical units, what the divergence represents.  This is very basic physics.

Also, error and uncertainty are used in science all the time, but are these merely a human factor?
No.  It is part of physics as we know it.  See:
https://en.wikipedia.org/wiki/Uncertainty_principle

Could digital math (aka computers) eliminate uncertainty and errors so that, with the proper technology, zero divergence could be achieved?
No.  In fact, digital math actually introduces errors of its own.  They are called "quantization errors," and they occur because we never use an infinite number of bits to represent a physical measurement.  An analog control system has no quantization errors, but it is still subject to noise (a form of error) and to uncertainty (see the Uncertainty Principle above).  But once I decide to use a certain digital word size (16 bits, 32 bits, 64 bits, or any finite number of bits), there is no getting around the fact that the Least Significant Bit (LSB) can only measure so accurately.  The LSB of any digital measurement defines the quantization error of that measurement.
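The LSB point can be sketched numerically (my own illustration; the 8-bit width and the [0, 1] range are arbitrary choices): rounding a real value to the nearest of 2**bits levels leaves a residual error of at most half an LSB, and no finite word size drives it to zero.

```python
def quantize(value: float, bits: int, lo: float, hi: float) -> float:
    """Round value to the nearest of 2**bits levels spanning [lo, hi]."""
    levels = 2 ** bits
    lsb = (hi - lo) / (levels - 1)        # size of the Least Significant Bit
    code = round((value - lo) / lsb)      # nearest representable code
    code = max(0, min(levels - 1, code))  # clamp to the representable range
    return lo + code * lsb

# 8 bits over [0, 1]: the quantization error is bounded by LSB / 2.
lsb = 1.0 / (2 ** 8 - 1)
x = 0.123456789
q = quantize(x, bits=8, lo=0.0, hi=1.0)
print(f"quantized = {q:.6f}, error = {abs(q - x):.6f}, LSB/2 = {lsb / 2:.6f}")
```

Doubling the word size shrinks the LSB but never eliminates it, which is exactly the quantization error described above.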
RMT

 
As I pointed out above, before anything is possible (physically or mathematically) one must first define the physical units of measure that define divergence. You cannot speak intelligently about any kind of "divergence" unless and until you describe, in physical units, what the divergence represents.  This is very basic physics.

No.  It is part of physics as we know it.  See:

https://en.wikipedia.org/wiki/Uncertainty_principle

No.  In fact, digital math actually introduces errors of its own.  They are called "quantization errors," and they occur because we never use an infinite number of bits to represent a physical measurement.  An analog control system has no quantization errors, but it is still subject to noise (a form of error) and to uncertainty (see the Uncertainty Principle above).  But once I decide to use a certain digital word size (16 bits, 32 bits, 64 bits, or any finite number of bits), there is no getting around the fact that the Least Significant Bit (LSB) can only measure so accurately.  The LSB of any digital measurement defines the quantization error of that measurement.

RMT
That darn infinity.  This is interesting.  Thanks.  Will study.  

 