“Nature of the Laws of Nature”, discussion
Commenting on the discussion we had today, I’d like to stress a single point, which I think is one of the main stumbling blocks for understanding our (Lev’s and my) arguments.
As of 2014, the scientific thought of humanity spans 45 orders of magnitude in scale, with up to 12–14 decimal digits of agreement between fundamental theories and observations. This is what we call theorizability/discoverability, cosmic observations, and the Pythagorean universe. On the other side, there is Tegmark’s full-blown mathematical multiverse (“mathematical democracy”), suggested as a self-consistent atheistic answer to the question “why these laws and not others?”
Our statement is that Tegmark’s hypothesis is totally inconsistent with discoverability over that range (45 orders of magnitude with 12 digits of accuracy). In fact, Alex Vilenkin had already drawn attention to this failure of “mathematical democracy”, as I quoted him in my talk:
“Tegmark’s proposal, however, faces a formidable problem. The number of mathematical structures increases with increasing complexity, suggesting that “typical” structures should be horrendously large and cumbersome. This seems to be in conflict with the simplicity and beauty of the theories describing our world.”
In our paper we added one feature to Vilenkin’s diagnosis: we gave it a quantitative character and linked it to the current state of fundamental physics. Here I will briefly explain this in a slightly different manner than in my talk and our paper.
According to Tegmark, our universe is a member of an infinite subset of universes compatible with the anthropic principle, allowing life and consciousness in the form we have on our planet. To be anthropic, the laws of nature have to lie within the anthropic width. This means that the laws of nature must be close to ours with anthropic accuracy: they must have the same structure and the same fundamental constants to within that accuracy. What specifically is this anthropic accuracy? I may refer to the classical book of Barrow and Tipler, where many anthropic limitations are considered. Among them, the most stringent are at the level of a ~0.001 perturbation, as for the proton-to-neutron mass ratio. For further considerations, I take that number as the “anthropic accuracy”. The limitations that are more relaxed make my argument even stronger. If somebody insisted that some future analysis could yield even more stringent limitations, I would not object to considering, say, 0.0001, which would not actually make any difference. So, I take 0.001 as the rough figure of merit for the anthropic accuracy.
Let us now consider an arbitrary universe from this 0.001 anthropic subset of the Tegmarkian multiverse, and ask what sort of physics might be discovered there. At the accuracy level of 0.001 or worse, the laws of nature of this universe are, by definition, the same as ours. However, if physicists of that hypothetical world tried to measure their fundamental constants at better accuracy, they would find that none of their measurements is reproducible at that level: the results would exhibit a sort of space-time noise with a relative amplitude of 0.001, driven by all the infinitely complicated terms of their true laws of nature. So, physics in that Tegmarkian universe would stop at the anthropic accuracy level, simply because, with probability 100%, no reproducible measurement would be possible there at better than anthropic accuracy.
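This reproducibility barrier is easy to make concrete numerically. Below is a minimal sketch (my own illustration, not from our paper): a hypothetical constant is repeatedly "measured" with an irreducible relative noise of 0.001, and the spread of the results never shrinks below that level, so no digit beyond roughly the third is reproducible.

```python
import random

random.seed(42)

TRUE_VALUE = 1.0          # hypothetical fundamental constant (arbitrary units)
ANTHROPIC_NOISE = 1e-3    # relative amplitude of the irreducible "space-time noise"

def measure():
    # Each measurement is perturbed at the anthropic accuracy level.
    return TRUE_VALUE * (1.0 + random.uniform(-ANTHROPIC_NOISE, ANTHROPIC_NOISE))

samples = [measure() for _ in range(10)]

# The relative spread of repeated measurements stays at ~1e-3 no matter
# how carefully each individual measurement is performed.
spread = (max(samples) - min(samples)) / TRUE_VALUE
print(f"relative spread over 10 runs: {spread:.1e}")
```

Averaging more runs does not help either: the "noise" here stands for genuinely different higher-order terms of the true laws at different points of space-time, not for a statistical error that averages out.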
In our universe we have a dramatically different situation. Those of our elegant laws of nature which we were able to test at high precision turned out to be valid with an accuracy many orders of magnitude better than this anthropic width. For instance, measurements of the electron magnetic moment agree with the theoretical prediction to 12 decimal digits. The indirect observation of gravitational waves demonstrated even better agreement with General Relativity, at the level of 14 digits. Thus, the many experiments that have proved the high precision of our elegant laws of nature, orders of magnitude better than the anthropic width, refute Tegmark’s hypothesis as definitively as anything could.
If this conclusion is not clear to somebody, let me suggest an analogous situation. Imagine you are observing a guy shooting at a 1-meter target from a distance of 1 meter, and bullet after bullet hits the center with an accuracy of 1 micron. After a hundred bullets have hit the center with that accuracy, you consider two hypotheses. Hypothesis 1 (Tegmarkian): the guy is shooting randomly, and these 100 central hits are just a random accident; next time (which nobody has actually seen, and the very existence of a “next time” is pure speculation) the bullets would be scattered randomly over the target. Hypothesis 2: the shooter is of extraordinary ability. Isn’t it obvious that one of these two hypotheses is absolutely absurd?
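The odds against Hypothesis 1 can be put in numbers with a back-of-the-envelope estimate (my own sketch, using the figures from the analogy and assuming shots land uniformly over the target):

```python
import math

TARGET_RADIUS = 0.5   # 1-meter target, so radius 0.5 m
HIT_RADIUS = 1e-6     # 1-micron accuracy around the center
N_SHOTS = 100

# Probability that one uniformly random shot lands within 1 micron of the center
# is the ratio of the two areas.
p_one = (HIT_RADIUS / TARGET_RADIUS) ** 2   # = 4e-12

# Probability of 100 such hits in a row; the number itself underflows a float,
# so express it as a power of ten via logarithms.
log10_p_all = N_SHOTS * math.log10(p_one)
print(f"one shot: {p_one:.0e}, hundred shots in a row: 10^{log10_p_all:.0f}")
```

A chance of roughly one in 10^1140 is the kind of “random accident” Hypothesis 1 asks us to accept; the analogous role of the micron-to-meter ratio is played by the measured 12–14-digit accuracy against the 0.001 anthropic width.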
More related ideas and details can be found in our paper “Genesis of a Pythagorean Universe”: