These observations correspond with my belief that physical reality is fundamentally spectral and random in origin. The traditional approach to this subject has been through analysis, and therefore through uniqueness. Notions of smoothness may be the wrong language to use here with the Zeta function. For instance, many phenomena in nature are fractal; like the branches of a tree, they may exhibit glint, spikes or flash. Signals may be analog or digital, so signal processing provides a very general language and does not necessarily require a classical spatial continuum in its description. Indeed Weierstrass, the father of modern-day analysis, constructed a continuous function that is nowhere differentiable. A recent paper by S. C. Woon has indicated that the Riemann Zeta function is a fractal and that, through Voronin's Universality, it encodes 'The Sum of All Theorems'.
As a convolution of signals, everything may be reduced to possible combinations of Signal + Noise in allowable configurations with an effective Signal-to-Noise ratio – light, heat and sound are perceptions of waves, and our brains are in fact wave-sensing devices. I suspect the total number of photons has to do with allowing the different features in scale to be seen, from Quantum (very small) to Euclidean (everyday experience) to Cosmological (very large). This includes the large-scale modulation (large wavelengths) that we simply don't notice. It may be that a critical amount of quanta is needed – a Googol = 10^100, say – to cause this, as seen on the Dyson-Montgomery information curve. THERE IS NOTHING MAGICAL IN ANY OF THIS: IT IS ONLY THE AMOUNT OF POSSIBLE INFORMATION IN THE UNIVERSE THAT GIVES RISE TO ITS FEATURES OF SCALE - THAT IS ALL THAT THERE IS. The modulated Rice function is just an information curve – of all possible information. The primes represent unique 'infon' states, albeit through a complicated relationship to the encoded Zeta function. The primes represent the minimal bit length of possible information that can be compressed, as they cannot be factored down into smaller integers (otherwise they would eventually be broken down to nothing).
Zeta effectively simulates the ensemble of all physical reality, and the reality is itself quantal, although our brains are not consciously aware of this level of reality.
The RICE distribution contains the modified Bessel function I0 (of the first kind, zeroth order) multiplied by the Rayleigh density, and my modification extends this to the RICE distribution multiplied by a combinatorial sum, which is the sum of all possible signals.
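The unmodified Rice density referred to above can be sketched numerically. This is a minimal sketch only: the parameter names A (signal amplitude) and sigma (noise level) are my labels, and the combinatorial-sum modification described in the text is not reproduced here.

```python
import numpy as np
from scipy.special import i0
from scipy.stats import rice

def rice_pdf(x, A, sigma):
    """Rice density: the Rayleigh density modulated by the modified
    Bessel function I0 of the first kind, zeroth order."""
    s2 = sigma ** 2
    return (x / s2) * np.exp(-(x ** 2 + A ** 2) / (2 * s2)) * i0(x * A / s2)

# Cross-check against scipy's parameterisation (shape b = A/sigma)
A, sigma = 0.5, 1.0 / np.sqrt(2.0)   # these values give SNR = A^2/(2 sigma^2) = 1/4
x = np.linspace(0.01, 5, 200)
assert np.allclose(rice_pdf(x, A, sigma), rice.pdf(x, A / sigma, scale=sigma))
```

As A → 0 the Bessel factor tends to 1 and the density reduces to the pure-noise Rayleigh case, which is the limit the text repeatedly appeals to.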
Computation in Zeta may not yet have revealed the cosmological characteristics (viz. the Rice modulation of all combinations of signals) in the statistics of the Montgomery-Dyson curve, because of current limits to bit capacity, and because the bit itself is deterministic (quantum computing is still in its infancy).
The cosmological implication of all this could be that the 'Big Bang' is still happening, but only at larger scales; the spike could be that of dark energy – the missing 73% or so that pushed out (the area under the spike). The outgoing information 'shock wave' carries on for eternity. Dark matter is seen as ripples in the Montgomery-Dyson curve, imprinted as the gravitational rotation curves (reference Vera Rubin) associated with the space curvature across large-scale galactic structures. Galaxies may have super-massive black holes at their centers, such as 'The Great Annihilator' at the center of our Milky Way. They have the mass of millions of suns, and it has even been suggested that they are gateways to other universes (via an Einstein-Rosen bridge), by forming a neutral force ring between the central singularity and the centrifugal rim. These lead to some important questions: Are black holes described by thermal phenomena limited to the left-hand side of the curve? Could it be that black holes, as well as being physically essential to stability, are demanded to logically exist?
What we see in our observable Universe is the small-scale debris left behind after a massive 'shock wave'. If we increase our field of view looking out on the cosmos using some super telescope, SNR increases on the Dyson information curve (where the mean signal-to-noise ratio is normalised; <SNR> = ¼). A constant position in scale in an expanding universe implies eventual thermodynamic 'heat death' as the noise-limited case is approached in time (increasing entropy), while the farthest reaches are seen to be accelerating away (negative tangent beyond the singularity at π/2). For 'Sum of all Possible Signals + Noise', read 'Sum of All Possible Universes'. This may help explain Hawking's 'No Boundary' proposal, in that the universe is infinite.
This negative pressure field may represent the inflationary phase of the universe, when matter and radiation were not yet separated; the universe became transparent when the crossover to positive pressure occurred, and gravity started to cluster mass into objects. The Casimir Effect may cause the energy of the vacuum to be actually manifested in the resonances of the Riemann Zeta function in the far limits up the Critical Line.
Thus in effect:
The sum of all signals, where each signal is noise modulated with infinite detail, gives rise to physical reality.
Zero-point energy is seen at these higher cosmological scales with some infinite bandwidth (by Fourier transforming) from an unlimited information/energy tap from the 'big bang' spike in the 'time' domain. This originates from the modulated tangent in the RICE probability density function, and it may be the means by which our Universe evolved from 'Nothing' (+d − d = 0, where d → ∞), with a mirror image of an antimatter universe. It is therefore nilpotent at the zeros.
Renormalization makes the area under the spike finite. We'll discuss this next:
In recent months, I've had time to further reflect on some fundamental and probably unanswerable metaphysical questions within our present language. Partly based on my findings, I propose a possible programme for others to run with. I've put these in as bullets for discussion and allude to some of them:
Prime numbers code the fundamental units of information. Information is the fundamental unit of Physics.
Energy maps to information in some essential way viz. you need energy to store information at the quantum level. They are synonymous.
I've thrown away the baggage to assume only the following: Physical Reality is built on information. In our observable Universe, information cannot be created or destroyed. Everything can be represented as some combination of Signals + Noise.
A final theory must not be concerned with continuous fields, not even with space-time, but rather with information exchange among physical processes.
The 'unreasonable effectiveness' of mathematics at explaining physics may be because maths, physics and evolutionary biology emerge from the same information 'engine' (Peter Rowlands, Liverpool University & Peter Marcer, BCSCMsG). In Rowlands's book 'Zero to Infinity', he proposes that the zeros are the manifestation of a nilpotent universal computational rewrite system (NUCRS). This has serious appeal in assuming literally starting from zero - Occam's Razor driven to the extreme, from which all meaningful mathematical semantics emerge. It also avoids esoteric mathematics, so that the 'man in the street' can understand it; in some respects it is highly relevant to perception and experience, as well as being very intuitive. Coupled with the Shannon approach to information processing, with Signals + Noise as the actual physical manifestation, it is a plausible 'Theory of Everything'.
Could a prime number spectral analyzer ever be built? What would be the impact on key encryption? The prime numbers cannot be generated by any kind of formula, as RH is unprovable; they represent new information as they increase in size.
In a recent Horizon programme, 'Alan and Marcus Go Forth and Multiply' (BBC2, 9pm, Tuesday 31/03/09), Marcus du Sautoy shows that the Riemann zeros have 'universality' in a demonstration at the National Physical Laboratory. A heavy object is thrown at a lump of quartz, and the resultant emission spectrum of the quartz is analysed. Amazingly, the spectral lines bear a remarkable similarity to the Riemann zeros, which du Sautoy says 'is beyond coincidence'. This is actually the piezoelectric effect, converting mechanical into electrical energy, and is the basis of Sonar.
Mean Signal-to-Noise ratio SNR. With no restriction, S ≤ N and S > N are equally likely, so mean [S/N] or <SNR> = 1. Incredibly, a mean signal-to-noise ratio A²/2σ² of ¼ in the MRice4 distribution gives a stable, flat infinite bandwidth in the Montgomery graph (the ¼ being the holographic factor). All other values are unstable. This may indicate stable galaxy formation.
Thus the Effective <SNR> = ¼ (i.e. a quarter).
The Bekenstein/Hawking formulation. Black hole entropy is A/4, where A is the surface area of the black hole. The entropy of the black hole is defined by its area, not its volume (log V) - i.e. by its capacity to stuff information within it. This gives rise to the Holographic Bound.
Thus Mean [Signal] = Mean [Noise] * ¼ for stability, if one plays tunes with the equation. A cutoff (bandwidth limit) is imposed by our collective intelligence, so we always have noise beyond the limits (Nyquist) of signal sampling in our treatments. The cutoff allows the information to be partitioned into discrete sample bins rather than continuous variables. Is it really all signal?
Array processing techniques such as beam forming and replica correlation (against original signal) are not only analogous, but physical manifestations of techniques that operate on elements within random matrices.
The relationship with prime numbers makes the essential information invariant whatever the information processing. An analogy is the same film recorded on both DVD and VHS video tape. The format of the data, and the way the data are processed, is completely different on each of these platforms, yet the viewer would watch and experience pretty much the same film (within variations of sampling and picture quality).
The information paradigm provides a description of MIND + MATTER - The Mind-Body problem. Can thoughts and dreams be described within a signal processing framework? Is the brain a quantum computer, and could it ever be replicated by machine? It has been suggested (Roger Penrose - The Emperor's New Mind & Shadows of the Mind) that neurons contain special microtubule structures that perform transmissions and quasi-computations within the intermediate scale between classical and quantum mechanics (QM).
The Sonar detection analogy may actually be more universal as a deep psychological model. Pfa (Probability of False Alarm), Pd (Probability of Detection) and mean signal-to-noise ratio (SNR) constitute a triad, where any two uniquely determine the third. Human beings have evolved as experts in recognizing patterns from incomplete information - but this comes at a price. By increasing Pd (our ability to discover patterns), we increase Pfa (our capacity for delusion). Maybe the insistence of pure mathematicians on setting Pfa to zero has hindered their ability to discover new territory. Indeed the whole concept of risk aversion is a serious concern in Physics to the likes of Smolin ('The Trouble with Physics') - yet some theoreticians cannot provide direct (or for that matter indirect) evidence for some of the proposed mathematical models (for example, supersymmetry and String Theory). Indeed, the cleverness or beauty of a mathematical model is no guarantee that it is TRUE.
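The Pfa/Pd/SNR triad can be illustrated for the textbook case of a coherent detector in Gaussian noise - an assumption on my part, since the text does not fix a detector model. Here the deflection d (a monotone function of SNR) and the function name pd_from_pfa are illustrative choices.

```python
import numpy as np
from scipy.stats import norm

def pd_from_pfa(pfa, d):
    """Probability of detection for a coherent detector in Gaussian noise:
    Pd = Q(Q^-1(Pfa) - d), where Q is the Gaussian tail probability and
    d is the deflection. Any two of {Pfa, Pd, d} fix the third."""
    return norm.sf(norm.isf(pfa) - d)

# Relaxing the false-alarm rate raises Pd: more pattern-finding,
# but also more capacity for 'delusion', exactly as argued above.
d = 2.0
assert pd_from_pfa(1e-2, d) > pd_from_pfa(1e-4, d)
```

The same trade-off is usually drawn as a ROC curve: sweeping the detection threshold traces out (Pfa, Pd) pairs for a fixed SNR.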
The Universe appears to contain fractal structures repeated randomly at all scales, like branches of a tree, down to a level of quantization below which reality is impenetrable. Is the quantisation seen as pure coherent signal in the energy levels of atoms, at whole numbers of level spacing? Compare the Montgomery-Dyson curve at the lower levels in Section 5. Do the teeth look like the Lyman/Balmer series in Hydrogen atoms? First glint at the first n = 1 quantization, where n·λ = 2π·R, with n = 1, 2, 3 ... No surprise that stars are made of Hydrogen in its purest form, and that galaxies are where all the elements are forged. This baryonic matter accumulates or condenses in the troughs of the Montgomery-Dyson information curve.
Take the square root of the ratio of the mass of a neutron to the mass of an electron: √(m_n/m_e) ≈ 3/[√φ − ζ(3)]. This is an interesting relationship (Eagleman Prize) between physical atomic constants that involves the Golden Ratio φ = (√5+1)/2 ≈ 1.618... and the Riemann Zeta function evaluated at 3. The former involves fractals, continued fractions (Rogers-Ramanujan identities?) and aesthetic geometry; the latter encodes prime numbers. Is this numerology or meta-mathematics? Gilson relates the fine structure constant from Physics - polygons with π - and the electromagnetic coupling constant. Nested structures within nested structures.
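The quoted coincidence is easy to check numerically. The neutron-to-electron mass ratio value below is the CODATA figure, which is my assumption; the text does not state which value it uses.

```python
import math
from scipy.special import zeta

phi = (1 + math.sqrt(5)) / 2          # golden ratio (sqrt(5)+1)/2 ~ 1.618
lhs = math.sqrt(1838.68366173)        # CODATA m_n/m_e (assumed value)
rhs = 3 / (math.sqrt(phi) - zeta(3))  # 3/[sqrt(phi) - zeta(3)]
assert abs(lhs - rhs) < 1e-3          # the two sides agree to ~5 significant figures
```

Both sides come out near 42.88, so the match holds to roughly one part in a hundred thousand; whether that is numerology or meta-mathematics is the question the text leaves open.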
G and h (the Gravitational and Planck Constants respectively) - are they also explicit functions of π, or at least implicitly dependent on it? If the sum of all possible signals is directly connected to π, we may be able to integrate the Modulated Rice to derive a relationship using the ratio of dark energy and renormalization, N·h = 1, to allow for the cutoff, where N is the total number of monadic quanta.
Do massive galactic black holes look like atoms when observed from the vantage of a larger scale? Do they emit dark matter, i.e. ripple space outward?
Does information processing negate String Theory? What of the Standard Model?
The Big Bang is an eternal process, at ever-increasing scale, in a universe with no boundary, coming through as a 'point of light' containing all information. Can the Big Bang ever be reconciled with a kind of (dynamic) Steady State of never-ending expansion - with scale invariance rather than length invariance? Can we ever surf the information wave, or will 'heat death' ultimately prevail as we slide back relatively on the curve?
Can new technology - anti-gravity, flying saucers, spaceships, spinning magnetic fields - harness the energy of the vacuum? Could this provide the unlimited thrust to travel to the outer reaches of our universe and allow us to escape this fate? UFOs, electrogravitic propulsion?
Are rotating black holes sources of (unlimited) dark energy? Black holes may in fact be dark energy stars, as they require an infinite time to form yet theoretically evaporate in a finite time.
Does the Riemann Hypothesis relate (indirectly) to Loop Quantum Gravity (Lee Smolin's 'The Trouble with Physics' - in particular see his section on how science is done)? Is there a one-to-many map of Mathematics-to-Physics, or are they the same?
Holography in black holes: the outside 2-d surface gives information mapping one-to-one onto the 3-d (+gravity) inside, depending on one's relative position on the 'God prime spike', within an AdS/CFT framework. Does the singularity at π/2 in the tangent function represent some kind of 'cosmological horizon' between the inside and outside of our universe?
Are such discontinuities man-made, due to our partitioning pieces of nature's jigsaw to manage concepts such as mass, energy and momentum? Does signal processing describe a more fundamental level of physical reality? These are loaded questions. Does the sum of all possible signals set our universe uniquely - Probability to Actuality across the singularity by analytic continuation?
The Yang-Mills Mass Gap. Does light have mass? Both, YES and NO, 1 or 0. Does it dwell in the Sum of all possible signals? Within the axiomatic framework, is this a meaningful question within the language of Physics and therefore Mathematics - or does it exist outside the Universe of Mathematics? Does the Schrödinger's Cat experiment have anything sensible to say about this? The LHC - will they detect anything useful? Are detected particles artifacts?
If Information is proportional to the surface area A of a (galactic) black hole, I = kA = k'E = k''M, where the k's are constants of proportionality and M and E are the total Mass and Energy content of the black hole. Then, by a Hooke's Law argument for a test mass m at the horizon, the (doubled) gravitational potential energy 2GMm/r_h = K·m, where r_h = 2GM/c² is the radius of the black hole Horizon. The factor of 2 is that of the gravitational field doubling up on the one-sided hologram in order to preserve the information. Eliminating M on both sides of the equation, we obtain the constant K = c², giving the binding energy E = Mc². Therefore, is all matter created in the Galactic black hole imparted with Einstein's relationship?
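The algebra of this sketch can be checked symbolically, taking the Schwarzschild radius r_h = 2GM/c² and the 'doubled' potential energy as given assumptions of the argument.

```python
import sympy as sp

G, M, m, c = sp.symbols('G M m c', positive=True)
r_h = 2 * G * M / c**2           # Schwarzschild horizon radius
# Doubled potential energy of test mass m at the horizon, as assumed
# by the holographic "factor of 2" argument in the text:
E_test = 2 * G * M * m / r_h
K = sp.simplify(E_test / m)      # the Hooke's-law constant per unit mass
assert K == c**2                 # M cancels, leaving K = c^2, i.e. E = M c^2
```

The key step is that M cancels between the numerator and the horizon radius, which is what the text means by "eliminating M on both sides".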
Gravity is non-baryonic in origin - it leaks into our universe, in parallel with but distinct from the electro-weak and nuclear forces associated with matter. The modelling of energy levels in atoms has no 'handle' for gravity; is this the missing Higgs particle? As λ → ∞, gravity modulates space as a carrier wave, as seen in the ripples. Gravity becomes weaker (in time) and cannot brake the accelerating universe.
Big Bang as an eternal process: expansion outside requires less instruction than all the processes that go on inside - is there any kind of analogy with serial and parallel computation respectively, as suggested by Seth Lloyd/Y. Jack Ng? Super-galactic black holes may be like massive hard drives. Is information lost at (and beyond) the event horizon? Is the transfer of information, and therefore determinism, always limited by the speed of light?
Is physics described by an information interface of relationships between physical processes rather than the physical entities themselves?
Black holes are associated with stupendous amounts of entropy. Is this a natural connection to the primes?
Are black holes actually dark energy stars?
Is the sum of all Signals + Noise really random? Is this the maximum storage of information, i.e. entropy?
Is dark matter the imprint of the side lobe ripple formed from the God spike discontinuity? (Note, the sidebands are features of what is known as the Gibbs phenomenon, due to the Fourier Transform over a discontinuity, in this case the God prime spike).
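The Gibbs phenomenon mentioned in the note can be demonstrated with a truncated Fourier series of a square wave - a generic stand-in for the 'God prime spike' discontinuity, chosen here for illustration.

```python
import numpy as np

def square_wave_partial_sum(t, n_terms):
    """Truncated Fourier series of a unit square wave:
    (4/pi) * sum over odd k of sin(k t)/k. Truncating the series at a
    jump discontinuity produces the Gibbs overshoot and side ripples."""
    s = np.zeros_like(t)
    for k in range(1, 2 * n_terms, 2):
        s += np.sin(k * t) / k
    return 4 / np.pi * s

t = np.linspace(1e-4, np.pi - 1e-4, 20000)
overshoot = square_wave_partial_sum(t, 200).max() - 1.0
# The overshoot is ~9% of the jump height (the jump is 2, so ~0.18),
# and it does not shrink as more terms are added - it only narrows.
assert 0.15 < overshoot < 0.20
```

The persistent side ripples near the jump are the 'sidebands' the note refers to.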
Are formations of galaxies explained by unravelling the ensemble of all signals and noise around their central massive rotating black hole?
Are galaxies the residue at the back-end of the God spike? Dark Energy and Dark Matter co-exist as opposite sides of the same coin, whatever the evolution.
Dirac's formulation of negative energy.
Are these Boltzmann spikes the 'boot-up' of reality and the 'kick start' of life in our own internal model of the world when we are born?
The laws of Physics serve as approximations and models of reality but NOT reality itself. At its most fundamental level, reality must be information-based down at some Level X (whatever X may be), and not the compartmentalized world-view of General Relativity or Quantum Mechanics which contain only some of the essential elements.
COBE measurement: the Planck curve @ 2.7K = Maxwell-Boltzmann in 3 dimensions, which is equivalent to the Rayleigh distribution in 2 dimensions. The black body formulation and the derivation of the 'gas in a box' Maxwell distribution of particle velocities are totally different, but the shared commonality is maximum entropy and therefore maximum information.
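The stated equivalence rests on a standard fact: the speed of an isotropic Gaussian velocity vector is Rayleigh-distributed in 2 dimensions and Maxwell-Boltzmann-distributed in 3. A quick Monte Carlo check (sample sizes and tolerances are my choices):

```python
import numpy as np

rng = np.random.default_rng(0)
v2 = rng.standard_normal((100000, 2))   # 2-d isotropic Gaussian velocities
v3 = rng.standard_normal((100000, 3))   # 3-d isotropic Gaussian velocities
speed2 = np.linalg.norm(v2, axis=1)     # Rayleigh-distributed speeds
speed3 = np.linalg.norm(v3, axis=1)     # Maxwell-Boltzmann-distributed speeds

# Compare sample means with the exact unit-sigma values:
# Rayleigh mean = sqrt(pi/2), Maxwell mean = 2*sqrt(2/pi)
assert abs(speed2.mean() - np.sqrt(np.pi / 2)) < 0.02
assert abs(speed3.mean() - 2 * np.sqrt(2 / np.pi)) < 0.02
```

Both are chi distributions of the underlying Gaussian components, differing only in the number of dimensions, which is the 'shared commonality' the paragraph points at.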
Helium II in this low temperature range (2-3K) exhibits weird behaviour. It becomes a superfluid, climbing up the sides of containers, seeping through the smallest gaps, even defying gravity until the levels of fluid in each partition of the container become the same. There is no friction, no viscosity, and gravity does not appear to interact with this strange quantum state. So the only thing left is information, and it must try to balance the information to achieve equipartition. Helium is also the primordial end product of nuclear reactions in stars, where the conversion of Hydrogen to Helium provides the engine; of course, large enough stars will eventually go on to form black holes.
In the semi-classical limit h → 0, the transition from the quantum to the classical chaos description is highly singular. Does this represent the crossing of the 'Boltzmann' God spike from signal (finite bandwidth) to chaotic (infinite bandwidth) respectively? The Planck signal A = n·h becomes diminished, so the God spike → Rice (S+N) → Rayleigh distribution (noise-limited) as the SNR goes to zero. Does this come out as thermal noise in our universe as an event, when the 'dice has been cast', rather than one of an infinite set of probable outcomes in the multiverse?
Semi-classical limit. Decoherence and the effect of decoupling from the environment, h → 0, with emergent phenomena at the macroscopic level.
The End of Time, T → ∞, is when the distant observer outside the black hole sees the object (the unfortunate astronaut, say) crossing (or rather hovering at the asymptotic limit of) the singularity at the event horizon, becoming increasingly red-shifted; leaving our universe for the multiverse of all possible signal plus noise. This is a transition from classical chaos to the realm of probability (quantum theory). (In his own private frame of reference, the astronaut will cleanly cross the event horizon within what appears to be normal finite time, without noticing anything unusual - assuming he survives and is not fried by heat or radiation.) Is the asymptotic limit T → ∞ at the boundary of the cosmological horizon looking in at the 'multiverse' of all possible information?
Is the Universe the logical consequence of observership? Is observation essential to its existence, as opposed to the Platonic view point? THE BASIC REALITY IS INFORMATION BASED ON SIMPLE ETERNAL LOGICAL RULES- that's all. Physics - mass and energy are simply emergent incidentals.
Is gravity instantaneous? If the Sun were to magically disappear, would it take 8 minutes or so (at the speed of light) for us to see the effect? Or is this 'thought experiment' itself paradoxical?
How is common experience shared through differing perception? Are these perceptions as real as physical objects such as tea cups, rocks, dust, stars and galaxies?
Consciousness passes through us sentient beings. Maybe it passes through, but not from, the Big Bang, forever as a (closed) loop singularity in an eternal process, forever rearranging 'stuff' at ever-increasing scale. Is this the Sum of All Possible Signals plus Noise?
Is the principle of information optimisation used by living creatures and biological systems to gain evolutionary competitive advantage, through information processing within brain and nerve cells, and within DNA molecules? DNA expresses itself through maximum diversity, i.e. entropy.
A → B → C
Suppose we want to get from A to C, but our impasse is B (the symbols can describe either an object or a process). We can either break through B by working harder, or circumvent it. The former approach involves specialisation; the latter involves diversification. Both are important, but I believe that the tendency in some areas of Mathematics has been to concentrate on the former, with less consideration given to the latter. The latter process is actually seen in evolution, where certain species adapt and become successful. It is called the Principle of Maximum Diversity, and it obviously works. In fact it is the most optimal process in nature.
Smolin refers to 'seers' as the people who 'think outside the box', who, in order to break free from the 'maze of knowledge', occasionally have to break the 'rules'. The other group Smolin refers to are the craftspeople: those who insist on rigour and mathematical technique to refine and consolidate – the mopping-up operations which follow the initial wave of discovery, which is itself important within Normal Science. The two activities require different motivations and psychological behaviours – non-conformism (independence of mind, making mistakes by trial and error, continuous improvement and lateral thinking) against conformism (the need to seek approval, elimination of error, finality and technical skill) respectively. The 'maze of knowledge' refers to the extent of established truths – blind adherence to the rules will mean forever being trapped in the maze, never discovering anything new. I agree with much of Lee Smolin's theme in Chapters 17 & 18 and will not repeat what he says, but wish to elaborate further. Smolin himself is not an outsider, but a product of the higher end of the University System, so his arguments appear parochial in a sense. Also, his ideas are less controversial than they first appear. In fact the conversation goes well beyond the Academy:
Universities have always traditionally been the source of new ideas. Entrepreneurs take novel ideas to find possible opportunities or gaps in the market. Industry applies these ideas to develop and produce real products, with the resources to take these to the market; entrepreneurs may go on to create the 'Microsofts' and 'Googles' of the future.
The beneficial outcome is wealth creation and jobs. Universities can both meet the demands of industry and allow free-thinkers, and it is probably best that they do, to get maximum benefit. Indeed there are many successful examples of collaboration. As Smolin suggests, they should also make room for outsiders: people who may not fit the traditional academic career pattern of exam success and a record of published papers, as well as people who can bridge over into industry, have contacts with sponsors or have commercial know-how. While tolerating free thinking, it should be reined in when necessary. After all, you couldn't run a tightly controlled operation like a warship, which has tried and tested ways of doing things, if everyone were doing this – it would be a total disaster.
The irony is that universities – partly because of rigid hierarchical structures in recruitment and tenure, conservative attitudes in the administration of research funding, and an over-emphasis on early academic success – tend to be risk-averse, and so no longer tolerate free-thinkers. In fact the Military (in particular in the US) is probably more receptive in these matters, and embraces free-thinkers in certain research streams (DARPA). This is partly because of more generous funding, but also because the military approach tends to be 'can do' and pragmatic, recognising the importance of crossing disciplines to get around problems and find innovative solutions within the military imperative. In fact, potentially the most fruitful research in universities probably has neither a high profile nor sufficient funding. Non-conformity can lead to innovation in the right kind of setting, and is probably an essential ingredient.
The idea of a Platonic Realm that only mathematicians, as part of some kind of elite priesthood, are actually capable of touching seemed nonsense. I never bought into this notion. In fact, I felt everything had to be based on information – the mixing of signals and noise. Everything can be represented as some combination of Signals + Noise and their interactions, in accordance with the Rice distribution – e.g. atoms, tables, chairs, stars, human beings. In fact, everything in life we love and value is based on information in a sense.
Is there a place for God? Are the primes God-given - or does it pass all understanding? Is our universe the only Anthropic one out of all the Multiverse? Do they all just depend on π? Or is there some cosmic fine-tuning involved, with effective SNR = ¼ (σ = 1/√2, A = ½, say) in the Rice4 distribution for stability, averaged over all possibilities?
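The bracketed values can be checked against the mean signal-to-noise ratio A²/2σ² used earlier in the text:

```python
import math

A, sigma = 0.5, 1 / math.sqrt(2)   # the values quoted in the text
snr = A**2 / (2 * sigma**2)        # mean SNR in the Rice model: A^2/(2 sigma^2)
assert math.isclose(snr, 0.25)     # recovers the effective <SNR> = 1/4
```

So the quoted pair (σ, A) is indeed one consistent choice that yields the ¼ factor; it is not the unique one, since only the ratio A²/2σ² is constrained.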
This ¼ factor may have something to do with holography. Remember, the entropy of a black hole of Area A is actually A/4.
The TAN factor is like a scale compression ratio of projected length (sin θ), or aperture facing an observer, to the (inverse of) informational depth (cos θ) in the Far Field (tan ≡ sin/cos). Recently, I have developed this further (refer to Section 10) and presented it at CASYS. This organisation is genuinely trying to break new ground, while still having to maintain the approval of established Science. However, conflict and consensus are both part of how Science moves forward.
Was the Universe constructed for our benefit? Are the properties of the Riemann Zeta function that special? Do the zeros follow the same process of maximization of possible information?
The Sum of All Possible Signals is the most unconstrained definition, with no constraints whatsoever (for example, X > Y and X = 3 are constraints). In its neat and tidy little definition, it contains the most information possible.
In the grand scale, the Universe may actually be 2-d, like a giant hologram. (Where does the 11 dim String Theory and supersymmetry fit into this?)
Square each of the sample bins of the spike to convert to PSD (Power Spectral Density). If the areas to the left and right-hand sides of the God Spike are in the ratio [A1/A2] of dark energy to actual energy respectively, and we let A1, A2 → 0, then the ratio limit A1/A2 → L, which is the optimal information seen on a 2-d surface, namely L ≈ 0.74048 = π/√18, assuming the Holographic Principle. When A1, A2 = 0 we are at the infinite, looking from a God-like perspective. (0/0 = ∞/∞: 'To know everything is to know nothing'. If the BITS are all ZERO, we are in darkness - we are blind. But what if each of the BITS = 1, i.e. all lit - are we still blind? Infinite string .....000000..... or .....111111.....)
The ratios (dark energy to dark matter + baryonic) are fixed for all eternity at the 'End of Time'. 'Playing tunes' with my equation when positioning the RAYLEIGH-RICE curve, with σ² ~ 1 and SNR = ¼, we get maximum 'face-on' area with about a 72/27 ratio. (Try σ² = 0.97 in the numerics; this gives the exact energy ratio with maximum possible information, i.e. maximum entropy.)
Neutron star collapse is a possible mechanism for maximum information packing. This is the last structure before matter collapses under its own gravity to form a black hole (if Mass > 2.5 solar masses). Does the neutron crystal lattice form FCC and leave the FCC trace behind on the hologram? Is each neutron within its own Schwarzschild radius - repeating the packing of hard spheres in FCC arrangements, continually drilling down the process forever?
Does the optimal packing of information (namely the Sum of all possible Signals) emerge as our Universe? I conjecture that the percentage of mass-energy that is dark energy = the FCC packing efficiency (~74.048%). This is discussed in the next section on the Kepler Conjecture, concerning maximum connectivity.
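The FCC packing efficiency quoted here follows from elementary geometry (4 spheres per conventional cubic cell of side 2√2·r) and equals the Kepler bound π/√18:

```python
import math

# FCC conventional cell: 4 spheres of radius r in a cube of side a = 2*sqrt(2)*r
r = 1.0
a = 2 * math.sqrt(2) * r
efficiency = 4 * (4 / 3) * math.pi * r**3 / a**3

assert math.isclose(efficiency, math.pi / math.sqrt(18))  # Kepler's bound, exactly
assert abs(efficiency - 0.74048) < 1e-4                   # ~74.048%, as quoted
```

The Kepler Conjecture (proved by Hales) states that no sphere packing does better than this π/√18; whether it matches the dark-energy fraction is, of course, the text's conjecture.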
"I am Alpha and Omega, the beginning and the ending,.." (Revelation 1:8)
A special acknowledgement goes to my friend Ron Mantay, who helped me construct this website. The pioneers of the internet have enabled an ordinary person like myself to disseminate my own findings and experience, from a different subject perspective, that may be of interest to others. I am especially grateful to Matthew Watkins for including my paper on his site, which is an audacious enterprise to connect Number Theory with Physics.
I have put 'clear blue water' between myself and the mathematical community for at least two years. I passionately believe that mathematical knowledge is for everybody and that nobody owns it. There is a real hunger for understanding of the deep fundamental questions among people not necessarily associated with traditional academia, who work independently but who have equally worthy contributions from their own professions and experience, especially in overlapping subject areas such as engineering and information science. Often they can be easily dismissed and tarred with the same brush as 'nutters' or 'cranks'. Individuals, hobbyists, and amateur scientists & inventors are often self-driven, and may choose, for whatever reason, or be unable, to be part of the other groups. They may feel excluded, they may be ridiculed, yet the very exclusion may actually drive them. They may work in their garages or garden sheds - and this country has a history of eccentric boffins. They may be highly creative and brilliantly successful, or utterly bonkers, totally delusional, and fail. However, they should never be underestimated or excluded from the discussion of how Science moves forward, as they often play a crucial role.
Over-specialisation, professional stereotyping, commercial self-interest, financial constraints, regulated environments, cultural boundaries, differing motives and a lack of understanding across groups have created knowledge 'tribes': between individuals and amateurs, academia and industry, and within these organisations themselves, across departments and business units, between engineers and academics, applied and pure scientists. This calls into question how scientific progress is made and managed, and raises the issue of risk aversion in Science (Smolin): whether to take new directions within flexible career structures, or to go over the old, the safe and the known. One thing is certain: the form mathematics now takes is changing. Consequently, the use of the internet is replacing traditional refereed journals.
Such shifts have always happened. They can be uncomfortable and disconcerting for some, but real shifts in Science are few and far between, and in Physics one has been a long time coming. It may start happening at an accelerated rate. We are not used to seeing this happen in our own lifetimes, especially given the complex professional and institutional structures that have been built in recent decades around existing paradigms.
The logical conclusion, then, is that we need a representative body that encompasses all mathematicians, academics and practitioners alike, and represents their interests nationally, along with the interests of mathematical knowledge at large; one that safeguards standards but also progresses the subject to meet the challenges of the present and the future. Otherwise, the gap between academics and practitioners will only widen.
Indeed, I predict that many new discoveries and opportunities will arise at the interfaces and in the gaps between specialities and disciplines. Work will need to become increasingly cross-disciplinary and systems-engineering in character, especially in the rapidly developing fields of modern communications and signal processing, which are now spilling over into other areas, including Theoretical Physics.
More importantly, mathematics should be fun, and we should never be afraid to test new, unfamiliar ideas, even if they turn out to be wrong. For every hundred duff ideas, one may hit the jackpot. We should care about fundamental questions, but care less about what others may think; think for ourselves, be bold, and challenge old assumptions, otherwise we don't move on. Even experts can delude themselves that they are always right. Whatever the outcome of such conjectures, one thing that is certain is that if they are to be resolved, it will be in a manner that is both controversial and unexpected.
I would also like to thank Professor Sir Michael V. Berry and Professor E. B. Davies (King's College London) for their advice and direction, and for giving me the time to listen to and push my ideas.
I would also like to thank my parents, particularly my Father, my friends, my wife Portia, and my family for their patience and for trying not to appear bored with my mutterings. A special acknowledgement also goes to former colleagues Robert Brown and Brian Knapp, who in many conversations provided a useful sounding board as well as the necessary scepticism.
All material on this site is property of Adrian Rifat.