QM and General Relativity may have finally been merged

You can, but don't hold your breath waiting for an answer. Just as I won't hold my breath waiting for an answer to my question: if everything is alive and conscious, wouldn't whatever created the universe also be alive and conscious?
Perhaps I should have started with an easier question: is a deterministic system deterministic? Although even that would probably receive a response of "No".

Perhaps this: is a dead person dead or alive? From what I gathered, the answer to that would have to be "alive".
 
That is a rather ambiguous question that was covered before. The chaos that arises from some coupled differential equations is deterministic in principle but not computationally predictable.

If you are referring to the cited abstract, I'm not familiar with many of the math references, but basically they are looking at probability distributions of a random process, which is not deterministic.

It's similar to a closed system of air. It's not possible to analyze it molecule by molecule, but a kinetic energy spectrum can be computed from state variables like pressure and temperature, given the volume.
Just because we can't compute it doesn't mean it's random. Quantum mechanics still follows the conservation laws, right? Doesn't that make it deterministic?
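
To make "deterministic but not computable" concrete, here is a minimal Python sketch (my own illustration, not from the thread) using the logistic map: the update rule is exact and deterministic, yet a starting error of one part in a trillion swamps the forecast within a few dozen steps.

```python
# Minimal sketch: the logistic map x -> r*x*(1-x) at r=4 is fully
# deterministic, yet nearby starting points diverge (deterministic chaos).

def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.3)            # reference orbit
b = logistic_orbit(0.3 + 1e-12)    # same rule, tiny initial error

# The rule is deterministic, but the two orbits decorrelate:
max_gap = max(abs(x - y) for x, y in zip(a, b))
```

Both orbits follow the same exact rule, so neither is "random"; the unpredictability lives entirely in our inability to know (or store) the initial condition precisely.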
 
How so? I'm a curious person. I like to tweak and tinker with things.
OK, what I meant was that Wuwei (perhaps a practitioner of Qigong?) knows his stuff. He understands what chaos is, but perhaps doesn't yet grasp the mind games played by scruffy.

So far as I'm concerned chaos is a term to describe deterministic systems which can get into states such that we cannot compute their future. That's pretty much it unless your name is scruffy.
 
OK, what I meant was that Wuwei (perhaps a practitioner of Qigong?) knows his stuff. He understands what chaos is, but perhaps doesn't yet grasp the mind games played by scruffy.

So far as I'm concerned chaos is a term to describe deterministic systems which can get into states such that we cannot compute their future. That's pretty much it unless your name is scruffy.
Had to look that up. :lol:
 
Probably the best male voice ever.
Yes, I was born and raised in Liverpool. My mother sang with the Beatles, and I spent several years hanging out with their former manager Allan Williams; very fond memories. Lennon's death was a huge blow to many people.
 
So... an interesting version of quantum foam is called "spin foam".

What's interesting about it is:

"Any evolution of the spin network provides a spin foam over a manifold of one dimension higher than the dimensions of the corresponding spin network."


There are also string foams of various kinds.


Some of these have analogs in materials.

For example:



 
While it is true that physics faces challenges at extreme scales such as the Planck scale, this does not represent a fundamental problem. Rather, it highlights the limitations of our current understanding and tools to probe and describe phenomena at such scales.

The Planck scale represents the smallest length scale and highest energy scale at which conventional theories of physics break down and quantum effects become significant. Describing physics there requires unifying quantum mechanics and general relativity, the long-sought theory of quantum gravity.

However, physicists are actively researching and developing new frameworks such as string theory and loop quantum gravity to address these challenges at the Planck scale. These approaches aim to provide a more complete and consistent description of the universe at all scales, including the Planck scale.

Therefore, while the Planck scale poses intriguing questions and requires further exploration, it does not signify a fundamental problem in physics. Instead, it serves as a frontier for advancing our understanding of nature and the underlying principles governing the universe.


At the Planck scale, which is approximately 10^-35 meters, classical physics breaks down because gravitational effects become comparable in strength to quantum effects. This leads to the need for a theory of quantum gravity to accurately describe the behavior of matter and energy at such small scales.

On the other hand, above the Planck scale, quantum mechanics is not applicable because the energy levels become so high that the effects of gravity dominate and the traditional principles of quantum mechanics no longer hold true. This is why there is a need for a theory that can unify quantum mechanics and general relativity to understand the behavior of the universe at both extremely small and extremely large scales. :)
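
As a quick numerical check (my own sketch, using CODATA values for the constants), the Planck length and energy follow directly from hbar, G, and c:

```python
# Sketch: computing the Planck scale from fundamental constants.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # Newton's constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s

planck_length = (hbar * G / c**3) ** 0.5   # ~1.6e-35 m, as quoted above
planck_energy = (hbar * c**5 / G) ** 0.5   # ~2e9 J, i.e. ~1.2e19 GeV
```

The length comes out around 1.6 x 10^-35 m, matching the scale quoted above; the corresponding energy is about 19 orders of magnitude beyond what any collider can reach, which is why the regime is so hard to probe.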

Constants are used in science as simplified representations of complex phenomena to make calculations and predictions more manageable. While the universe may not be at a perfect equilibrium, constants help us understand and describe the natural world with a high degree of accuracy.

Even with the universe's dynamic and evolving nature, constants provide a foundation for scientific understanding and allow for meaningful comparisons and measurements.

It's important to acknowledge that constants are not meant to imply that the universe is static or unchanging but rather serve as tools to help us comprehend the underlying principles governing the universe.

By using constants, scientists can develop theories, conduct experiments, and make sense of the vast complexities of the cosmos. While the universe may be in a state of flux, constants remain valuable and reliable within the scope of our current understanding of the natural world.

Neutrinos are fundamental particles that play a crucial role in our understanding of the universe, but they are not the key to unifying quantum mechanics and general relativity.

The unification of these two fundamental theories is a major goal in theoretical physics, and while neutrinos have provided valuable insights into the nature of matter and energy, they are not the sole solution to this long-standing challenge. Some physicists believe that a unified theory, often referred to as a theory of everything, may involve concepts beyond neutrinos, such as supersymmetry or extra dimensions.

Neutrinos are important for studying the fundamental forces and particles in the universe, but the quest for unification remains a complex and ongoing endeavor that likely involves a deeper understanding of the fundamental nature of space, time, and matter.

A group of blind men come across an elephant for the first time and each touch a different part of the elephant's body. One man touches the elephant's tail and thinks it's a rope, another touches its leg and thinks it's a tree trunk, another touches its side and thinks it's a wall, and so on. They all argue about what the elephant is like based on their limited perspective, not realizing that they are each experiencing just one part of the whole truth. The moral of the story is that individuals may have different perspectives based on their limited experiences, and it is important to consider multiple viewpoints to understand the full picture. :)




This is why I've been suggesting that a model based on symmetries and constraints is more useful than one that tries to pin down constants.

For example - one of the early successes of this way of thinking is Furry's theorem, which implies that it is impossible to create "one" photon (or any odd number of photons) from the vacuum.

 
That is a rather ambiguous question that was covered before. The chaos that arises from some coupled differential equations is deterministic in principle but not computationally predictable.

If you are referring to the cited abstract, I'm not familiar with many of the math references, but basically they are looking at probability distributions of a random process, which is not deterministic.

It's similar to a closed system of air. It's not possible to analyze it molecule by molecule, but a kinetic energy spectrum can be computed from state variables like pressure and temperature, given the volume.
I'm interested in "topological order", which is one of the places where randomness meets determinism. My main interest is in brain states related to criticality, but all of it has analogs in physics.



For example - this:


happens in the brain too. Neural networks undergo phase changes that are analogous to phase changes in matter. I described this previously in terms of a ball on an energy surface, and the pic in the last link shows how a symmetric structure can result in a random outcome at scale.

This is how and why I became interested in entanglement. To understand entanglement, one can begin with the simple (quantum) concept of "all possible paths". In other words, on a saddle of the type shown, there are two sets of paths, and a superposition consists of both.
 
Okay, so now I'm going to make a wild departure. From classical physics. And then show how classical physics is merely a subset of this departure. And then show how the departure can be used to understand everything from wave particle duality to an E8xE8 universe.

The departure is: the Cantor space, commonly represented as 2^omega, where 2 is the discrete two-point space {0, 1}.


The Cantor space is homeomorphic to the Cantor set; it can be mapped to the real numbers using a simple formula. It is a subspace of every perfect complete metric space (Cauchy space). The requirements for a Cantor space are: non-empty, perfect, compact, metrizable, and totally disconnected.
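
The "simple formula" can be sketched like this (my own illustration): each binary sequence becomes a ternary expansion using only the digits 0 and 2, which lands exactly on the middle-thirds Cantor set.

```python
# Sketch: the standard map from (truncated) sequences in 2^omega to the
# middle-thirds Cantor set: bit b_i becomes the ternary digit 2*b_i.

def cantor_point(bits):
    """Map a finite 0/1 sequence to a point of the Cantor set in [0, 1]."""
    return sum(2 * b / 3 ** (i + 1) for i, b in enumerate(bits))

lo = cantor_point([0] * 30)   # all-zeros sequence -> left endpoint 0
hi = cantor_point([1] * 30)   # all-ones sequence -> approaches 1
```

Truncating at 30 bits is just a finite stand-in for an infinite sequence; extending the sequence refines the point without ever leaving the Cantor set.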

Now - we build the Cantor analog of a Hilbert space by first considering the product of two Cantor spaces and using the Cantor function, which is defined through base-3 expansions.


This function has reflection symmetry and a pair of magnifications. The important part is that it can be used to generate space-filling curves, which is what our spacetime is. According to this view, a Hilbert space "describes" our universe, whereas a Cantor space "generates" it.
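
For concreteness, here is a short sketch (mine, not the author's) of the Cantor function via its usual base-3 definition: read ternary digits up to the first 1, replace 2s with 1s, and reinterpret the result in base 2.

```python
# Sketch: the Cantor (devil's staircase) function via base-3 digits.

def cantor_function(x, depth=40):
    """Approximate the Cantor function on [0, 1]."""
    if x >= 1.0:
        return 1.0
    result, scale = 0.0, 0.5
    for _ in range(depth):
        x *= 3
        digit = int(x)          # next ternary digit (0, 1, or 2)
        x -= digit
        if digit == 1:          # stop at the first ternary digit 1
            return result + scale
        result += scale * (digit // 2)   # ternary 2 -> binary 1
        scale /= 2
    return result
```

Up to rounding, cantor_function(1/3) and cantor_function(1/2) both give 0.5: the function is constant across every gap of the Cantor set, which is exactly the staircase behavior.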

The Cantor space is self-similar; to generate the standard Minkowski/Lorentz/Einstein spacetime we need a dimensional expectation value D=4. It turns out the Cantor function is the cumulative distribution function of the (1/2, 1/2) Bernoulli measure (as supported on our Cantor set). The resulting distribution is called the Cantor distribution.


This distribution has NO probability density function, because its derivative is zero almost everywhere, so integrating the derivative does not recover the distribution. However, the moments can be calculated, with the peculiarity that all odd central moments are 0.
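
Those moment claims are easy to spot-check numerically. A sketch of mine, assuming the (1/2, 1/2) Bernoulli construction above: sample X = sum of 2*b_i/3^(i+1) with fair coin flips and estimate the moments; the mean should come out near 1/2, the variance near 1/8, and the third central moment near 0.

```python
import random

# Sketch: Monte Carlo moments of the Cantor distribution, sampled as
# sum(2*b_i / 3^(i+1)) with independent fair bits b_i.
random.seed(1)

def cantor_sample(depth=30):
    return sum(2 * random.getrandbits(1) / 3 ** (i + 1) for i in range(depth))

xs = [cantor_sample() for _ in range(100_000)]
n = len(xs)
mean = sum(xs) / n                              # ~0.5
var = sum((x - mean) ** 2 for x in xs) / n      # ~0.125 (= 1/8)
m3 = sum((x - mean) ** 3 for x in xs) / n       # ~0 (odd central moment)
```

The symmetry of the construction around 1/2 is what kills every odd central moment, even though the distribution has no density at all.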

The Cantor set underlying the Cantor space, as generated by the Cantor function, has a direct algebraic representation as a sequence of left-right moves down a binary tree: composing choices yields a dyadic monoid that can be mapped directly to a string (a computer string, not a physics string - the latter part comes later :p ).
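
A tiny sketch of that tree walk (my own illustration): each left/right letter keeps one outer third of the current interval, so a word of length n addresses an interval of width 3^-n, and concatenating words is exactly the monoid composition.

```python
# Sketch: composing L/R moves down the middle-thirds construction.

def descend(path):
    """Follow a string of 'L'/'R' moves, returning the (lo, hi) interval."""
    lo, hi = 0.0, 1.0
    for move in path:
        third = (hi - lo) / 3
        if move == 'L':
            hi = lo + third    # keep the left outer third
        else:
            lo = hi - third    # keep the right outer third
    return lo, hi

left = descend("L")      # -> (0.0, ~1/3)
right = descend("R")     # -> (~2/3, 1.0)
deep = descend("LRLR")   # an interval of width 3**-4
```

Every infinite L/R string pins down exactly one point of the Cantor set, which is the sense in which the set "is" the space of binary strings.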

The next part is to show that Hilbert space is a projection of the Cantor space by measurement. In other words, a measurement is an observation we describe in terms of a Hilbert space, but what we're actually looking at (and measuring) is the Cantor space.

Much of this seems counterintuitive at first (the Cantor function is also known as the Devil's Staircase), but the results speak for themselves. As you know I'm interested in information, and what this shows us is that the entire universe and everything in it (including all of mathematics) can be generated from the two primitives 0 and 1.
 
Here's a weird little piece of evidence, but it stands out like a sore thumb.

Listen:

"Abstract (Oxford Journals): This is a comprehensive account of a particular recent development concerning the thermal character inherent in the quantum field as viewed from a uniformly accelerated frame of reference. That is, the power spectrum of the vacuum noise (or the detector-response function) seen by a uniformly accelerated observer in flat spacetimes of arbitrary dimensions is investigated and is shown to exhibit the phenomenon of the apparent inversion of statistics in odd dimensions. Its relation to the thermalization theorem is clarified. Also discussed are the closely related phenomena occurring in the vacuum stress of the Rindler manifold and in the noise seen by a comoving observer in the de Sitter spacetime, as well as those associated with the circular motion in the flat spacetime."

They're talking about the Unruh effect, how the vacuum looks to an accelerating observer.

 
So far as I'm concerned chaos is a term to describe deterministic systems which can get into states such that we cannot compute their future.
I think you are hung up on a label; you have beaten this poor horse to death.

Just because a system is deterministic does not mean it's predictable. Chaotic systems are only predictable within boundary conditions. They don't "get into that state"; they are in that state by definition.

A hurricane is a chaotic system. Hurricanes cause a lot of damage, and we spend a lot of money trying to predict them. We monitor them with satellites and aircraft and balloons, etc. We gather huge amounts of data and use supercomputers to model their paths.

When you look at the predictions, they are pretty good in the immediate future, but the predicted path is a fan shape; the uncertainty grows rapidly with time. That is the nature of chaotic systems (which is pretty much everything in nature).

Hurricanes are "deterministic", sure. Air obeys rules: hot air always rises, air always flows from high pressure to low pressure, etc. But that does not mean hurricanes are predictable: we can never gather enough data or model them precisely enough to make anything other than a short-term prediction (that rapidly degrades).
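
The fan-shaped forecast is easy to reproduce with the Lorenz '63 toy model of atmospheric convection (my sketch, using simple Euler integration): two runs that differ by one part per billion in the initial state end up in visibly different places.

```python
# Sketch: sensitive dependence in the Lorenz '63 system (Euler steps).

def step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)   # "measurement error": one part per billion
for _ in range(20_000):       # integrate to t = 20 model time units
    a, b = step(a), step(b)

gap = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
```

For the first few time units the two runs track each other (the "pretty good" short-term forecast); by t = 20 the separation has grown by several orders of magnitude, which is the widening fan.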
 
I think you are hung up on a label; you have beaten this poor horse to death.
Is he still being a pest? :p

Computer programmers are weird; some of them think algorithmic randomness is all there is. von Mises, Martin-Löf, Kolmogorov complexity, yadda yadda.

Others have the mistaken notion that randomness means a flat distribution.

But you're echoing what I told him from the get-go: randomness is an observation, it's a measurement thing. It doesn't belong to the system, it belongs to the observer.

A random process is a function that generates random outcomes, which must then be observed or measured. In set theory there is "choice"; you can look at it as the observer himself being the random process. But what happens next? The observer has to look at the outcome. If my choice is "green" and I reach in to extract the green ball, there is still a small but significant probability that I might grab the wrong ball and get an undesired (random) outcome. Even this tiny amount of randomness is still random. In probability theory we say the likelihood "vanishes" (but never entirely disappears), and we use terminology like "almost always" when we're being careful.

Noise has some interesting qualities. One of them is "stochastic resonance", which is used in a lot of radio systems. You can add "more" noise to a noisy signal, to better recover the original signal. This is a common strategy in audio, in MRI, and in single cell recording of neurons.
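
Here is a minimal sketch of stochastic resonance (my own toy model, not a real radio system): a sine wave too weak to cross a detector threshold produces no output at all, but with a moderate amount of added noise the detector's firing starts to track the signal.

```python
import math, random

# Sketch: stochastic resonance with a simple threshold detector.
random.seed(7)

N = 20_000
threshold = 1.0
signal = [0.5 * math.sin(2 * math.pi * 5 * i / N) for i in range(N)]  # peak 0.5

def detect(noise_sd):
    """1/0 detector output for signal plus Gaussian noise."""
    return [1.0 if s + random.gauss(0.0, noise_sd) > threshold else 0.0
            for s in signal]

def correlation(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    if va == 0.0 or vb == 0.0:
        return 0.0
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / math.sqrt(va * vb)

quiet = correlation(signal, detect(0.0))   # no noise: detector never fires
noisy = correlation(signal, detect(0.5))   # moderate noise: output tracks signal
```

With zero noise the sub-threshold signal is invisible (correlation 0); adding noise actually increases the information getting through the detector, which is the counterintuitive core of the effect.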


The amount of randomness in a signal can be measured. Radio and audio people call it the signal-to-noise ratio, but the mathematician Alfréd Rényi came up with a foolproof way of doing it. I used to design microphone preamps; there we deal with junction noise, which is 0.9 nV/sqrt(Hz) and places the lower limit of attainable cleanliness at around -137.5 dB for a near-perfect amplifier stage.
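
To sanity-check those numbers (my arithmetic, assuming a 20 kHz audio bandwidth and a 1 V reference for the dB figure, neither of which is stated above):

```python
import math

# Sketch: from spot noise density to an integrated noise floor.
# Assumptions: 20 kHz audio bandwidth, dB figure referenced to 1 V.
density = 0.9e-9        # V / sqrt(Hz), the quoted junction-noise density
bandwidth = 20_000.0    # Hz, assumed audio band

noise_rms = density * math.sqrt(bandwidth)     # ~127 nV RMS
noise_db = 20.0 * math.log10(noise_rms / 1.0)  # dB relative to 1 V
```

That lands at roughly -138 dB re 1 V, in the same ballpark as the quoted -137.5 dB, so the quoted limit is consistent with its own density figure under these assumptions.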

The level of determinism in quantum mechanics is still unknown, because there are 20+ orders of magnitude between the generator and the observer. The law of large numbers (really the central limit theorem) makes everything look Gaussian, but at the Planck scale things are nonlinear and nonlocal.
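
The "everything looks Gaussian" point is easy to watch happen (a sketch of mine): sum a couple hundred decidedly non-Gaussian uniform contributions and the standardized total behaves like a standard normal.

```python
import random

# Sketch: central limit theorem: sums of uniform noise look Gaussian.
random.seed(3)

def macro_sample(n=200):
    """Standardized sum of n uniform(0,1) 'microscopic' contributions."""
    s = sum(random.random() for _ in range(n))
    return (s - n * 0.5) / (n / 12.0) ** 0.5   # mean n/2, variance n/12

zs = [macro_sample() for _ in range(5_000)]

# A standard normal puts ~68% of its mass within one sigma:
within_one_sigma = sum(abs(z) < 1.0 for z in zs) / len(zs)
```

The microscopic distribution is flat, yet the macroscopic observable is bell-shaped, which is why bulk measurements reveal so little about the fine-grained generator.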

So if you can "measure" an "amount" of randomness, it makes no sense to divide the world into deterministic systems and random systems. "Every" deterministic system is random to a certain degree.
 
Perhaps I should have started with an easier question: is a deterministic system deterministic? Although even that would probably receive a response of "No".

Perhaps this: is a dead person dead or alive? From what I gathered, the answer to that would have to be "alive".

Your question was ambiguous because you didn't clarify whether you were talking in general or specifically referring to the Planck-Wheeler quantum foam reference you posted along with your question. Your sarcasm here indicates you weren't satisfied with my response. It would have been more fruitful if you had asked for clarification.

This whole topic of random vs. chaos is screwy because it is unclear whether it's a conflict of word usage or a conflict of underlying principles.
 
I think you are hung up on a label; you have beaten this poor horse to death.

Just because a system is deterministic does not mean it's predictable. Chaotic systems are only predictable within boundary conditions. They don't "get into that state"; they are in that state by definition.

A hurricane is a chaotic system. Hurricanes cause a lot of damage, and we spend a lot of money trying to predict them. We monitor them with satellites and aircraft and balloons, etc. We gather huge amounts of data and use supercomputers to model their paths.

When you look at the predictions, they are pretty good in the immediate future, but the predicted path is a fan shape; the uncertainty grows rapidly with time. That is the nature of chaotic systems (which is pretty much everything in nature).

Hurricanes are "deterministic", sure. Air obeys rules: hot air always rises, air always flows from high pressure to low pressure, etc. But that does not mean hurricanes are predictable: we can never gather enough data or model them precisely enough to make anything other than a short-term prediction (that rapidly degrades).
You are not aware of the context of my remark. Here is what transpired some weeks ago; my position will make more sense once you see what was being said - see this post.

It began with me asking, "If you get a predictable outcome then how can that be described as random?" I knew I was talking to someone with a sounder understanding of mathematics than mine, so I assumed he'd agree with what I said.

He even ridiculed me by asking, "So weather is deterministic? Really?" It must be, else we could never forecast it.

Anyone who asserts the weather isn't deterministic is either confused, living in some bubble, or doesn't understand what deterministic and computable mean.
 
