Why should computational neuroscientists care about the distribution of prime numbers?
Motivation:
How could a reasonable scientist make sense of the astonishing developments in number theory over the course of millennia (dating back to Euclid and Eratosthenes) without a precise application in mind? What is more, how did number theory singularly avoid developing into a baroque construct, the tragic fate of all pure sciences that are not moulded by practical applications? Yet, if not for these developments, secure communications and modern cryptography in general would be nonexistent. It is as if the best musicians in history had developed the fundamentals of jazz without a particular audience in mind, each generation of mathematicians passing down its craft in number theory to the next with a consistency and originality unmatched by any other branch of science.
The objective of this article is precisely to propose a reasonable explanation.
The distribution of primes and information-theoretic mechanisms of human cognition:
From the Prime Number Theorem, the number of primes not exceeding \(N\) satisfies:
\begin{equation} \pi(N) \sim \frac{N}{\ln N} \end{equation}
What makes the density of primes interesting to an information theorist is that the typical frequency \(\pi(N)/N\) with which a large integer \(N\) is prime is inversely proportional to the information gained from observing that integer, \(K_U(N)\), up to the constant \(\frac{\frac{d}{dN} 2^N}{2^N} = \ln 2\):
\begin{equation} K_U(N) \sim \log_2(N) \sim \frac{1}{\ln 2} \cdot \Big(\frac{\pi(N)}{N} \Big)^{-1} \sim \frac{\ln N}{\ln 2} \end{equation}
where \(\ln N\) corresponds to the average information, in nats, gained from identifying a unique object distributed uniformly among \(N\) possible locations.
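Both asymptotic claims are easy to check numerically. The sketch below (plain Python; the function name `prime_pi` is my own) counts primes with a sieve of Eratosthenes and compares \(\pi(N)\) against \(N/\ln N\), and the inverse density \(\frac{1}{\ln 2}\big(\pi(N)/N\big)^{-1}\) against \(\log_2 N\). Note that convergence in the Prime Number Theorem is slow, so the agreement at computationally accessible \(N\) is only rough:

```python
import math

def prime_pi(n):
    """Count primes <= n using a sieve of Eratosthenes."""
    if n < 2:
        return 0
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n**0.5) + 1):
        if is_prime[p]:
            # Mark all multiples of p starting at p^2 as composite.
            is_prime[p*p::p] = [False] * len(is_prime[p*p::p])
    return sum(is_prime)

for N in (10**3, 10**5, 10**6):
    pi_n = prime_pi(N)
    pnt_estimate = N / math.log(N)
    # Inverse prime density, rescaled by 1/ln 2 as in the equation above.
    info_estimate = (1 / math.log(2)) * (N / pi_n)
    print(f"N={N}: pi(N)={pi_n}, N/ln N={pnt_estimate:.0f}, "
          f"(1/ln 2)(pi(N)/N)^-1={info_estimate:.2f}, log2(N)={math.log2(N):.2f}")
```

For \(N = 10^6\) this gives \(\pi(N) = 78498\) against \(N/\ln N \approx 72382\), and an information estimate within about two bits of \(\log_2 N \approx 19.93\), consistent with the slow asymptotic convergence.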
In addition, information theory and information-theoretic formulations of human cognition, such as the Free Energy Principle, provide us with an ideal vantage point from which to clarify the nature of Godfrey Hardy's remarkable insight:
"A science is said to be useful if its development tends to accentuate the existing inequalities in the distribution of wealth, or more directly promotes the destruction of human life. The theory of prime numbers satisfies no such criteria. Those who pursue it will, if they are wise, make no attempt to justify their interest in a subject so trivial and so remote, and will console themselves with the thought that the greatest mathematicians of all ages have found in it a mysterious attraction impossible to resist." (Godfrey Hardy)
The second half of this quote is crucial, as it appears that we may use information theory to address part of the mystery Hardy refers to. The information-theoretic arguments presented thus far suggest that, to a large extent, the human brain is an instrument for data compression, besides being a movement coprocessor. The implicit argument here is that good mathematicians maximise expected surprise (i.e. information gained).
Besides telling us something profound about the human mind, might this also reveal something important about the distribution of primes? It might, if we first make the reasonable conjecture that all of physics may be simulated by a Universal Quantum Computer. In light of this hypothesis, it is worth considering that a number of reasonable theories of quantum measurement are not observer-independent. These include the Wigner-von Neumann formulation of the measurement problem as well as the Many-Worlds formulation.
Deeper investigations in these complementary directions may reveal a direct correspondence between informationtheoretic mechanisms of human cognition and the distribution of prime numbers.
References:

J. Hadamard, Sur la distribution des zéros de la fonction ζ(s) et ses conséquences arithmétiques, Bull. Soc. Math. France 24 (1896), 199–220; reprinted in Œuvres de Jacques Hadamard, C.N.R.S., Paris, 1968, vol. 1, 189–210.

Aidan Rocke (https://mathoverflow.net/users/56328/aidanrocke), Information-theoretic derivation of the prime number theorem, URL (version: 2021-02-20): https://mathoverflow.net/q/384109

Lance Fortnow. Kolmogorov Complexity. 2000.

John A. Wheeler, 1990, "Information, physics, quantum: The search for links" in W. Zurek (ed.), Complexity, Entropy, and the Physics of Information. Redwood City, CA: Addison-Wesley.

Karl Friston. The free-energy principle: a rough guide to the brain? Cell Press. 2009.

Hugh Everett. Theory of the Universal Wavefunction. Thesis, Princeton University, 1956/1973, pp. 1–140.