"making bread" is one of the most mathematical things you can do in the kitchen, which is much more than measuring ingredients. When you fold and compress the dough, you are undergoing topological deformation. The kneading action constitutes an operation called Baker's Map in chaos theory, in which the mixing of a dynamical system is mapped from a system to itself and carried out many times.
Mixing is chaotic, but not random, behavior. It arises from nonlinear self-interaction within the system, which makes the system continually reshape itself. After enough iterations, every possible arrangement of the system's material will appear. This is central to the field of mathematical physics called ergodic theory, which is the foundation of statistical physics. Of course, there are limits: atoms do not fuse or drift light-years away unless that is part of how you define the system. But mixing ensures not only that the dough reaches maximum entropy, but also that, given enough time, the enormous number of molecules in it passes through every possible arrangement.
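In the notation of ergodic theory (mine, not the author's), a measure-preserving map T on a probability space (X, \mu) is called mixing when any region A eventually becomes evenly spread through any other region B:

    \lim_{n \to \infty} \mu\left(T^{-n}A \cap B\right) = \mu(A)\,\mu(B).

For such systems, Birkhoff's ergodic theorem says that following a single trajectory for long enough is equivalent to averaging over the whole space,

    \lim_{N \to \infty} \frac{1}{N} \sum_{n=0}^{N-1} f\left(T^{n}x\right) = \int_X f \, d\mu \quad \text{for almost every } x,

which is the precise sense in which "every possible arrangement eventually appears".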
In our universe, this baker's-map-style mixing is one of the most basic processes there is, far more fundamental than randomness. What does this mean?
The basic idea is that a baker's map iterated countless times becomes completely unpredictable: all of its components are so thoroughly scrambled that the result is effectively random.
When you add a random process like this to a deterministic process, such as a swinging spring or a ball rolling down a hill, you get a stochastic system. The deterministic process supplies the overall motion, while the randomness supplies the statistical spread around it. How random a stochastic system looks depends on the length scale involved, but most systems show some random behavior.
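Here is a minimal sketch of such a stochastic system, assuming a damped spring plus Gaussian kicks (the function name and parameter values are illustrative, not from the article): the deterministic force supplies the overall oscillation, and the noise supplies the statistical spread around it.

    import numpy as np

    def noisy_spring(x0=1.0, v0=0.0, k=1.0, damping=0.1, noise=0.2,
                     dt=0.01, steps=5000, seed=0):
        # Deterministic spring (drift) plus random kicks (diffusion),
        # integrated with a simple Euler-Maruyama scheme.
        rng = np.random.default_rng(seed)
        x, v = x0, v0
        xs = np.empty(steps)
        for i in range(steps):
            a = -k * x - damping * v  # deterministic force: Hooke's law with damping
            v += a * dt + noise * np.sqrt(dt) * rng.standard_normal()  # random kick
            x += v * dt
            xs[i] = x
        return xs

    trajectory = noisy_spring()
    print(trajectory.mean(), trajectory.std())  # global oscillation plus statistical scatter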
The relationship between chaotic mixing and randomness underlies all of statistical physics, but I think chaotic mixing is actually the more fundamental of the two. The reason is that, at least for classical systems such as gases, we know the molecules are the basic units, so chaotic mixing does not continue below the molecular level. Instead, the thermal behavior of molecules comes from a combination of their kinetic energy and their collisions with other molecules, and this molecular model agrees well with experimental measurements.
On this small scale, the interaction between two (or more) molecules is deterministic and chaotic, not random. So the features we associate with randomness come from deterministic chaos, not from genuine randomness.
Another interesting feature of statistical physics is that randomness is not essential for making predictions. It is usually invoked to model unpredictable thermal fluctuations, but if the underlying components are not random, merely well mixed, the statistical predictions usually still work. What matters is the mixing, because mixing puts the components in the right configuration at the right time.
Take temperature or pressure, for example, both of which are the result of a statistical process. For temperature, what matters is that molecules deliver the right energy in the right places to reflect the average kinetic energy per molecule. And, as I mentioned earlier, this mixing must be fractal in nature: you need mixing at every level, so that whatever the size of your thermometer, you still read the same average temperature. But at the scale of a single molecule this breaks down. If your thermometer were calibrated against a single gas particle, it could not measure temperature correctly.
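To make "average kinetic energy per molecule" concrete, the standard kinetic-theory relation (not spelled out in the original) for an ideal monatomic gas is

    \frac{3}{2} k_B T = \left\langle \tfrac{1}{2} m v^2 \right\rangle,

so a temperature reading only settles down when the average on the right is taken over a large number of well-mixed molecules; the kinetic energy of any single particle fluctuates far too wildly to define a temperature by itself.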
Why is this important?
Because I believe the universe is fundamentally not random. God does not roll the dice. Rather, God is a baker, kneading everything from the cosmic scale down to the Planck length. Any smaller, and the baking stops. Even quantum phenomena are just baking in another dimension, where space and time themselves are being kneaded.
If so, then randomness is an illusion, and models of randomness are approximations of mixing.
Nobel laureate Kenneth Wilson may be one of the greatest heroes of statistical physics and quantum field theory. He developed a theory of length scales, showing how to model a physical system at a particular length scale and how to relate the scales to one another. From Wilson's legacy we get the modern understanding of stochastic systems, especially of phase transitions, such as water freezing or evaporating, or a magnet becoming magnetized or demagnetized.
Thus we can understand why physical systems mix not only on one length scale but on all of them, from the molecular scale down to the Planck length. This is because such systems are "self-similar" across length scales: they behave alike at different scales, though stronger or weaker self-interactions can make them more or less thoroughly mixed at any given scale.
For example, some systems are smooth on a large scale but rough on a small scale. Others, on the contrary, are complex and chaotic on a large scale, but simple and predictable on a small scale.
Air behaves quite differently on large scales than on small ones, which is why an aircraft can glide through it on large fixed wings while a fly must buzz its way through swirling vortices. This has a lot to do with the viscosity of air, and partly with its random motion.
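The usual way to quantify this scale dependence (standard fluid dynamics, not from the original text) is the Reynolds number, the ratio of inertial to viscous effects:

    \mathrm{Re} = \frac{\rho\, v\, L}{\mu}.

For an airliner's wing Re is of order 10^7 or more, while for a fly's wing it is only of order a few hundred, so the very same air behaves almost like a frictionless fluid for the plane and like something thick and syrupy for the fly.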
The stock market is another example: it is more predictable over long intervals than over short ones. That is why you can put your money in an index fund, leave it for ten years, and be reasonably confident you will come out ahead.
However, chaos on one scale can also spill over into another. This is the so-called butterfly effect. Wilson pointed out specifically that it is foolish to try to treat the different scales separately. Instead, we must connect them and allow for the possibility that the phenomenon we want to study depends on what happens at every length scale, not just one. Especially when a system approaches a so-called "tipping point", such as ice melting or a magnet magnetizing, all the scales act together to produce the phenomenon, cascading and fluctuating up and down until something remarkable happens and the system is transformed.
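One standard way to express "all scales come together" at such a tipping point, in textbook notation the article does not use: near a critical point the correlation length diverges and correlations decay as a power law rather than dying off at any particular scale,

    \xi \sim |T - T_c|^{-\nu}, \qquad G(r) \sim r^{-(d - 2 + \eta)} \ \text{ at } T = T_c,

so fluctuations of every size, from the microscopic up to the whole system, take part in the change.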
Similarly, in finance, small changes can feed into large trends and vice versa, until a tipping point of market collapse or boom is reached. You can never be sure that what happens second by second will not blow up and affect what happens month by month.
None of this assumes that randomness exists or does not exist. Instead, it simply assumes that mixing (random or non-random) occurs at many different scales and at different speeds. Randomness is of course convenient for mathematicians and physicists, but it is not necessary.
One benefit of assuming a system is random is that it sweeps up all the external factors you do not understand. But as a fundamental model of the world, randomness is untenable. Stochastic models always rely on some mysterious source of entropy; they never allow the system to be completely closed and deterministic. Another feature of randomness is that it destroys some nice mathematical properties of a system, such as differentiability.
If you have a particle that fluctuates randomly according to some normal distribution, its motion is non-differentiable. The reason is that, over a very small interval of time, the average distance it moves is proportional to the square root of that time. A derivative requires the infinitesimal distance to be proportional to the infinitesimal time, so that the two small quantities cancel in the ratio. Randomness breaks that proportionality: unless the randomness magically disappears below some length scale, you are stuck with non-differentiable motion.
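Spelled out in symbols (a standard property of Brownian-type motion, added here for clarity): if X(t) is a random walk with normally distributed increments, then

    \mathbb{E}\big[\,|X(t + \Delta t) - X(t)|\,\big] \propto \sqrt{\Delta t},
    \qquad
    \frac{|X(t + \Delta t) - X(t)|}{\Delta t} \propto \frac{1}{\sqrt{\Delta t}} \to \infty \ \text{ as } \Delta t \to 0,

so the difference quotient has no limit and the path has no derivative anywhere.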
That's why quantum mechanics has non-commutative operators, which give different answers according to the order in which you apply them. This is the source of the Heisenberg uncertainty principle.
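For reference, in standard notation (not used in the original text): position and momentum fail to commute, and the uncertainty bound follows directly from that failure,

    [\hat{x}, \hat{p}] = i\hbar, \qquad \Delta x \, \Delta p \ge \frac{\hbar}{2}.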
If systems are never random, wouldn't it be simpler to say they are merely mixed, whether by internal nonlinear self-interactions or by as-yet-unexplained interactions with external forces?
Maybe, maybe not. We have the whole machinery of distribution theory for dealing with randomness, and we handle it with operator calculus. We don't need to replace it with complicated, multi-scale nonlinear mixing operations to get things like differentiability.
But what if we really want to know what's going on inside? We can pay more attention to randomness, or we can question it and try to understand its limitations.
I am thinking of the Lindblad equations in quantum mechanics, which build randomness into the theory to explain why quantum measurements appear random. Since quantum predictions work without them, why introduce them, unless you want to insist that God does play dice?
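For readers who have not seen it, the Lindblad master equation the author refers to has the standard form (notation mine):

    \frac{d\rho}{dt} = -\frac{i}{\hbar}[H, \rho] + \sum_k \gamma_k \left( L_k \rho L_k^\dagger - \tfrac{1}{2}\left\{ L_k^\dagger L_k,\ \rho \right\} \right),

where the first term is ordinary deterministic quantum evolution and the sum adds the irreversible, noise-like coupling to an environment; the point above is that this second piece is an assumption, not a necessity.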
What if God is a baker?
This article comes from the WeChat public account Lao Hu Shuo Science (ID: LaohuSci). Author: Lao Hu.