From: Volkmar Weiss []
Sent: Sonntag, 9. November 2014 17:30
To: ''; ''
Subject: The spatial metric of brain underlying the temporal metric of EEG and thought

Dear colleagues:

Not unexpectedly, the reaction to my former email was a weak one. Obviously, the idea that between the morphological structure of the brain and its informational code there should and must be an optimal congruence is not as common knowledge as it should be, not even among Nobel Prize winners and their coworkers.

Decades ago I published a paper: Weiss, Volkmar: The spatial metric of brain underlying the temporal metric of EEG and thought. Gegenbaurs morphologisches Jahrbuch 136 (1990) 79-87, from which I cite from page 83: “The anatomical background of array processing by the brain. … There must exist a correspondence between anatomical structure, anatomical array, and the velocity and wave-number of electrocortical waves.”

From this insight it was still a long way to the conclusion, supported by empirical data, that only the mathematics of the golden section (synonymously called the golden mean or the golden ratio) provides the basis for an optimal congruence between structure and code, in this way satisfying Occam’s razor. There is an immediate relationship between the hexagonal grid structure and the golden mean; see, for example, the paper “Golden Hexagons” by Gunter Weiss.

 With kind regards

 Volkmar Weiss


Dear Professors Moser:

(Dear colleagues, addressed under Cc:)

My congratulations on the Nobel Prize! When I saw in the media the hexagonal grid cells you discovered in the brain, I was immediately aware of the relationship of this structure to the golden ratio.


 In 2003 I published, together with my son Harald, the paper “The golden mean as clock cycle of brain waves”.



There are already a number of technical applications of hexagonal grid structures and of coding with Fibonacci numbers and hence the golden mean.

Therefore, we should not be surprised if we discover the way in which our brain uses the unique mathematical advantages of the golden ratio for the coding and decoding of information.

Volkmar Weiss


Chaos, Solitons and Fractals 18 (2003) No. 4, 643-652 - Elsevier Author Gateway, online version

Short term memory capacity, Attention, EEG, Quantum computing

The golden mean as clock cycle of brain waves

Harald Weiss and Volkmar Weiss

Rietschelstr. 28, D-04177 Leipzig, Germany;

Accepted 20 February 2003; Available online 15 May 2003


Key words: Short-term memory storage capacity, neoPiagetian, cognitive development, IQ, processing speed, reading rate, power spectral density of the EEG, golden ratio, golden section, Fibonacci, quantum computer


   The principle of information coding by the brain seems to be based on the golden mean. For decades psychologists have claimed memory span to be the missing link between psychometric intelligence and cognition. By applying Bose-Einstein statistics to learning experiments, Pascual-Leone obtained a fit between predicted and tested span. Multiplying span by mental speed (bits processed per unit time) and using the entropy formula for bosons, we obtain the same result. If we understand span as the quantum number n of a harmonic oscillator, we obtain this result from the EEG. The metric of brain waves can always be understood as a superposition of n harmonics times 2Φ, where half of the fundamental is the golden mean Φ (= 1.618) as the point of resonance. Such wave packets scaled in powers of the golden mean have to be understood as numbers with directions, where bifurcations occur at the edge of chaos, i.e. 2Φ = 3 + φ³. Similarities with El Naschie’s theory of high energy particle physics are also discussed.

The substantial role of the golden mean for brain waves has been empirically confirmed by: Roopun, Anita K. et al. (2008). Temporal interactions between cortical rhythms. Frontiers in Neuroscience 2, 145-154.


1.  Introduction

  “It bothers me that, according to the laws as we understand them today, it takes ... an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what a tiny piece of space-time is going to do? So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed and the laws will turn out to be simple, like the checker board with all its apparent complexities,” wrote Feynman in 1965 (cited from [1, p. 638]). Wolfram [2], too, believes that there are quite simple mechanisms that underlie human reasoning. He asserts that the use of memory is what in fact underlies almost every aspect of human thinking. Capabilities like generalization, analogy and intuition immediately seem very closely related to the ability to retrieve data from memory on the basis of similarity.

   Already in 1966, Kac [3] had put forward the question: Can one hear the shape of a drum? In order to find an answer, Kac asks for the energy in the frequency interval df. To this end, he calculates the number of harmonics which lie between the frequencies f and f + df and multiplies this number by the energy which belongs to the frequency f, and which according to the theory of quantum mechanics is the same for all frequencies. By solving the eigenvalue problem of the wave equation, Kac is able to state that one can not only hear the area of a reflecting surface, its volume and circumference, but also the connectivity of paths of an irregularly shaped network. If the brain waves had the possibility to measure and hence to know the eigenvalues of a spatially distributed amount of information, they would have nearly perfect access to information and - in terms of communication theory - perform nearly perfect bandlimited processing. As we know, the eigenvalues are proportional to the squares (i.e. variances) of resonant frequencies [4].

   The question whether brain waves reflect underlying information processing is as old as EEG research itself. Therefore, relationships between well-confirmed psychometric and psychophysiological empirical facts [5] and EEG spectral density are very interesting.


2.  Memory span as the quantum of action of thought

     Ever since attention became the object of scientific study, psychologists have recognised that it possesses a quantitative dimension in terms of the maximum number of items to which a person can attend at one time. It now seems almost universally accepted [6] that short-term memory has a capacity limit of seven plus or minus two [7]. The possibility that such quantitative limits on attention span might be related to qualitative differences in thought and reasoning was recognised by Piaget [8]. Beginning with Pascual-Leone [9], the prediction of children’s reasoning from estimates of their memory span has been a major goal of neo-Piagetian theories of cognitive development. Halford’s [10] research has led to the conclusion that the best metric for processing capacity is the complexity of relations that can be processed in parallel.

   In a typical Piagetian class inclusion task, children are shown a collection of objects (e.g., wooden beads), most of which are of one colour (e.g., red) and the rest of another colour (e.g., white). Children are asked if there are more red beads or more wooden beads and are credited with class inclusion if they indicate that there are more wooden beads because the red beads are included in the total class of wooden beads. Under the assumption that each simultaneous value assignment requires a unit of capacity, the operation of class inclusion would require a minimum of 3 such units, that means a memory span of 3. It was shown by Humphreys et al. [11] that a total score on 27 Piagetian tasks was very highly correlated (r = .88) with the 14-item Wechsler IQ test. From only 13 Piagetian tasks Humphreys et al. could form a test that is an excellent measure of general cognitive ability in its own right but can also add to the information furnished by Wechsler Verbal and Performance IQs and academic achievement. Piagetian tasks and ordinary IQ test items differ only in that for Piagetian tasks the minimum memory span needed to solve the task is known, whereas for ordinary tests it is not, or not explicitly.

    Pascual-Leone understands memory span as the maximum of discrete and equal energy units (i.e. quanta) which every subject has at his disposal. In the first step of Pascual-Leone’s experimental procedure all subjects learned a small repertoire of stimulus-response units. The responses were overlearned motor behaviours such as: raise-the-hand, hit-the-basket, clap-hands, etc. If a subject has a memory span of 5 and has to keep in mind a memory set of 5 elements, he cannot arrange element 1 corresponding to span or attention space 1, element 2 to span 2, and so on. This is impossible. Because access to chunks in working memory is random, the available energy quanta are not distinguishable and have to be defined as bosons (i.e. indistinguishable quanta). By applying the Bose-Einstein occupancy model of combinatorics to his learning experiments with children of different ages, Pascual-Leone obtained a very good agreement between empirical probabilities and the theoretical probabilities predicted by Bose-Einstein statistics. Weiss [for detailed statistics see 12] calculated from Pascual-Leone’s sample of 11.8-year-olds a mean information entropy H of 86.4 bits. A mean IQ of 119 for 11.8-year-olds corresponds in performance to an adult IQ of 102 for about 40-year-olds. In tables of IQ test results edited by Lehrl et al. [13] and based on concepts of information theory (see below), we read for this age and IQ 102 a short-term memory storage capacity of 84 bits. Two approaches with seemingly completely differing theoretical starting points lead on the absolute scale of information entropy to practically the same result. For Pascual-Leone’s data the latter result was even obtained after applying quantum mechanics twice in series, for calculating Bose-Einstein statistics and information entropy.

    The variance of the Bose-Einstein distribution equals m² + m, where m reflects the granularity of the energy due to Einstein’s photons (cited from [14, p. 189]). If we set the variance to 1 and m = x, we get x² + x = 1. The solution of this equation is φ = (√5 − 1)/2 = 0.618033..., the golden mean. Its inverse 1/φ = Φ (also called the golden ratio, the golden number, the golden section or the divine proportion) has the property 1 + Φ = Φ². Therefore the double geometric Φ-series

..., 1/Φ², 1/Φ, 1, Φ, Φ², Φ³, ...

has the properties

           ..., 1/Φ² + 1/Φ = 1, 1/Φ + 1 = Φ, 1 + Φ = Φ², ...                                (1)

and is thus a Fibonacci series. It is the only geometric series that is also a Fibonacci series. Essential is the fact that the fractional parts .618033... of φ, 1/φ, and 1/φ + 1 = Φ² are identical. The title chosen by us refers to this golden mean in the broader sense.
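These identities are easy to check numerically. A minimal Python sketch (nothing here is specific to the paper's data; it only verifies the algebra stated above):

```python
import math

# Golden mean phi = (sqrt(5) - 1)/2 and its inverse Phi = 1/phi.
phi = (math.sqrt(5) - 1) / 2       # 0.618033...
Phi = 1 / phi                      # 1.618033...

# Defining property: 1 + Phi = Phi**2.
assert math.isclose(1 + Phi, Phi ** 2)

# The double geometric series ..., 1/Phi**2, 1/Phi, 1, Phi, Phi**2, ...
# is also a Fibonacci series: each term is the sum of the two before it.
series = [Phi ** k for k in range(-2, 4)]
for a, b, c in zip(series, series[1:], series[2:]):
    assert math.isclose(a + b, c)

# The fractional parts of phi, 1/phi, and 1/phi + 1 are identical.
for x in (phi, 1 / phi, 1 / phi + 1):
    print(f"{x:.6f} -> fractional part {x % 1:.6f}")
```

All three printed fractional parts agree to the displayed precision, which is the "essential fact" noted in the text.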

   Forces are now recognised as resulting from the exchange of huge numbers of discrete particles, or information patterns called vector bosons, which are exchanged between two or more particle information patterns. The absorption of a vector boson information pattern changes the internal oscillation state of a particle, and causes an impulse of motion to occur along a particular direction. This turns out to be the quantum origin of all forces. Therefore, forces can be thought of as digital rather than analogue.

   In 2001 Bianconi and Barabási [15] discovered that not only neural networks but all evolving networks, including the World Wide Web and business networks, can be mapped onto an equilibrium Bose gas, where nodes correspond to energy levels and links represent particles. These network researchers, still unaware of the research by Pascual-Leone, found this correspondence between network dynamics and a Bose gas highly unexpected [16].


3.  The information entropy of working memory capacity

    Shannon’s information entropy H is the logarithm of the number of microstates or patterns consistent with our information. To reduce the information to this scalar measure (evaluated in bits of accuracy gained) is to reduce the form to its topological complexity. Each global state of a network can be assigned a single number, called the energy of that state; i.e. the difference in the log probabilities, and hence in information entropy, of two global states (thoughts) is just their energy difference. By extension of Shannon’s concept of channel capacity [17], in 1959 Frank [18] claimed cognitive performance to be limited by the channel capacity of short-term memory. He argued that the capacity H of short-term memory (measured in bits of information) is the product of the processing speed S of information flow (in bits per second) and the duration time D (in seconds) of information in short-term memory absent rehearsal:


                                            H (bits) = S (bits/s) × D (s).                                           (2)

According to Frank the mean channel capacity follows a lognormal distribution [19], where 140 bits correspond to IQ 130, 105 bits to IQ 112, and 70 bits to IQ 92.
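Equation (2) can be illustrated with a short sketch. Note that the pairing of S and D values below is chosen for illustration only; the text quotes just the resulting capacities and their IQ anchors (140 bits for IQ 130, etc.):

```python
# Frank's relation (Eq. 2): short-term memory capacity H (bits) is the
# product of processing speed S (bits/s) and duration D (s) of information
# in short-term memory absent rehearsal.
def capacity_bits(speed_bits_per_s: float, duration_s: float) -> float:
    return speed_bits_per_s * duration_s

# Illustrative pairing (assumed): S = 20 bits/s held for D = 7 s yields
# the 140 bits that the text associates with IQ 130.
print(capacity_bits(20.0, 7.0))   # 140.0
```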

   The first experimental approach to determine mental processing speed in bits per second was accomplished by Naylor [20]. His method of testing enabled the subjects to present to themselves a stimulus which remained as long as they kept a finely balanced switch depressed. The stimuli were digits between 1 and 9 or numbers between 1 and 32, presented singly or in groups of two, three, four, or five. By this procedure the time was measured until the signs were perceived by the subjects. The information content of one digit out of the repertoire of nine possibilities was 2^3.17 ≈ 9; that is, 3.17 bits. Recognition of one of the 32 possibilities (= 2^5) was equal to 5 bits. Thus, Naylor measured not only the time between stimulus and reaction but also the amount of stimulus information. This is the prerequisite for the more striking observation by Lehrl and Fischer [21] that the results (in bits/s) are numerically equal although the repertoires of signs differ. The measurement of stimuli and reaction in terms of the information unit (the bit) and physical time will only reveal properties of the subject if the information content of the objective repertoire agrees with that of the subjective repertoire. When a repertoire of signs (such as letters, digits or chunks) is overlearned, independently presented signs, whether of sense or nonsense in common usage, have the same objective as subjective information.

    Instead of applying one of the elementary cognitive tasks already mentioned, Lehrl et al. operationalised Frank’s concept of short-term memory storage capacity (in bits) by testing memory span and reading rate. The subject is simply asked to read a series of mixed-up letters in an undertone as quickly as possible. As soon as the subject begins to speak, the stopwatch is started. The time from the first to the last spoken letter is measured. It should be documented in tenths of a second, e.g., 7.3 s. When evaluating the raw scores it must be remembered that a subject can only perform full binary decisions. Therefore, the recognition of a letter out of the repertoire of 27 letters, which theoretically has an information content of 4.7 bits (2^4.7 ≈ 26), needs five binary decisions. Since each letter thus contains 5 bits of information, the 20 letters contain 100 bits. This is divided by the time of reading to obtain the amount of information processed in a second, S (bits/s). For example, if the best time of a subject is 7.3 s, then S = 100/7.3 bits/s = 13.7 bits/s. By standardising letter reading on adults, normative data are available (see Table 1, column mental speed). - see: The Basic Period of Individual Mental Speed (BIP)
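The letter-reading computation of S described above can be sketched as follows; the 27-letter repertoire and the rounding up to 5 full binary decisions per letter are taken directly from the text:

```python
import math

# Lehrl's letter-reading measure of processing speed S: a letter from a
# 27-letter repertoire needs ceil(log2(27)) = 5 full binary decisions,
# so 20 letters carry 100 bits.
def processing_speed(n_letters: int, reading_time_s: float) -> float:
    bits_per_letter = math.ceil(math.log2(27))   # 5 binary decisions
    return n_letters * bits_per_letter / reading_time_s

# The worked example from the text: 20 letters read in 7.3 s.
print(round(processing_speed(20, 7.3), 1))   # 13.7 (bits/s)
```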

    Forward memory span D can be predicted on the basis of the number of simple words which the subject can read out in 1.8 seconds. Regardless of the number of syllables, any subject in an empirical investigation by Baddeley et al. [22] was able to recall as many words as he could read in 1.8 s. This result can easily be confirmed by the normative data from Lehrl et al. For example, for IQ 100 holds: the 20 letters of their reading task are read in 6.6 s; D (memory span) corresponds to 5.4. Now we can calculate x = 6.6 s × 5.4 / 20 ≈ 1.8 s. Hence, span and processing rate are both measures of the same working memory system [23]. The greater the memory span, the faster the processing rate. The time required to process a full memory load is a constant, independent of the type of material stored.
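The 1.8 s constant quoted above can be recomputed directly from the normative values given in the text (20 letters read in 6.6 s, memory span 5.4):

```python
# Time to read one full memory span's worth of letters:
# (reading time per letter) * (span) = 6.6 s / 20 * 5.4.
reading_time_s = 6.6
n_letters = 20
span = 5.4

time_per_full_span = reading_time_s * span / n_letters
print(round(time_per_full_span, 2))   # 1.78, i.e. roughly the 1.8 s constant
```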

    The overall importance of reading speed in everyday life and as an indicator of processing speed is obvious. With increasing age, children name familiar objects more rapidly, and these naming times are related to reading ability. Greater memory capacity is associated with greater reading recognition skill, and the same comprehension processes underlie both reading and auding. The fastest rate at which individuals can successfully operate in reading and auding is limited by their thinking rate. Consequently, there is an inverse relationship between the length of words and their frequencies of usage. Because words are stored in neural networks, the discovery by Bianconi and Barabási [16] reveals the deeper meaning of Zipf’s and Pareto’s power law [24], by which the size of the vocabulary of a given individual can be understood as a function of his memory span n.


4. Memory span and EEG

   During the last decades a number of authors have claimed correlations of memory span not only with mental speed but also with electrophysiological variables of the EEG. In 1935, Gibbs et al. [25] had already documented that patients (sample size 55) with petit mal epilepsy show, in all cases during seizures, an outburst of spike waves of great amplitude at a frequency of about 3 Hz. The fact that such seizures can be aborted using brief stimuli is very suggestive of an underlying multistable dynamical system. This finding is part of our confirmed knowledge and can be read in every textbook on EEG or epilepsy. From this Liberson [26] had drawn the conclusion that all significant channels in EEG could be n multiples of one fundamental frequency of about 3.3 Hz. According to his empirical data the number of these multiples (harmonics) is nine, the maximum of memory span (see Table 1). Assuming these numbers one to nine to be quanta of action (as Pascual-Leone did), we again obtain a relationship between the classical formulae of quantum statistics and empirical results of both EEG and psychometric research.


Table 1  Memory span (corresponding to the number of an EEG harmonic), frequency of EEG harmonics, mental speed, and their relationships with information entropy, power density of short-term memory storage capacity, and IQ

[The body of Table 1 is not reproduced here; of its layout only the column formulas E = nf, E = n² × 2Φ, and n × 1s are recoverable.]

Column b: empirical data from Liberson [26].

Column c:  product of column b times n.

Column e:  product of column d times n.

Columns a, d, f and h: empirical psychometric data from Lehrl et al. [13].

Their sample size for standardising the test was 672 subjects.

Column g is purely theoretical.



   Assuming the numbers 1 to 9 of memory span to be equivalents of harmonics in the sense of wave theory, the power spectral density E is given by the eigenstate energy-frequency relationship E = nf (kT ln 2), where f is frequency. According to thermodynamics, the measurement of 1 bit of information entropy [27] requires a minimum energy of kT ln 2, where k is Boltzmann’s constant and T is absolute temperature. During the duration of 1 perceptual moment, 1 bit of information is processed per harmonic. That means that 1 break of symmetry and 1 phase reversal after each zero-crossing of an EEG wave corresponds to a possible 1-bit decision between two alternatives. Consequently, each degree of freedom and of translation (this refers to the mathematical group theory [28] underlying both mental rotation and quantum mechanics) corresponds to an energy of kT/2 or its macroscopic analogue.

   Because the frequency of EEG harmonics can be expressed as n × 2Φ Hz, the expected latencies of the harmonics follow as 1000 ms/(n × 2Φ), and for power density follows E = Σ n × 2Φ (kT ln 2). The physical term power is appropriate because it is a measure of the ability of waves at frequency f to do work. The power spectrum of the EEG describes the total variance in the amplitudes due to the integer multiples of the fundamental frequency (i.e. the first harmonic 1 × 2Φ). In order to calculate power density in this way, the waveform must be squared and then integrated for the duration of its impulse response, i.e. the duration of the transient of 1 complete wave packet containing all the harmonics of the memory span of a given subject.
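Under the stated assumption of a fundamental of 2Φ ≈ 3.236 Hz, the expected harmonic frequencies and latencies for memory spans n = 1..9 can be tabulated in a few lines:

```python
# Harmonic frequencies n * 2*Phi (Hz) and expected latencies 1000/(n * 2*Phi)
# (ms), following the relations stated in the text.
Phi = (1 + 5 ** 0.5) / 2       # golden ratio, 1.618...
fundamental = 2 * Phi          # 3.236 Hz

for n in range(1, 10):
    freq = n * fundamental
    print(f"n={n}: {freq:6.2f} Hz, expected latency {1000 / freq:6.1f} ms")
```

For n = 1 this gives about 3.24 Hz and a latency of about 309 ms.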

   The relationships in Table 1 are further supported by data from Bennett [29], who reanalysed the Ertl and Schafer [30] findings of a correlation between IQ and latencies of EEG evoked potential components. Bennett (confirmed by Flinn [31]) accomplished a Fourier transformation of the original data and found that high IQ subjects (IQ above 123) go through 20 or more perceptual moments per second, low IQ subjects (IQ below 75) only through 8 moments or even less (compare Table 1, columns b and d). This striking parallelism between EEG results and channel capacity, measured with mental tests, is emphasised by results from Harwood and Naylor [32]: 42 young university students had a mean channel capacity of 21.4 bits/s; 105 "average normal" adults aged 60-69 years performed at 14.2 bits/s; the age group of 70-79 years (sample size 67) achieved 12.9 bits/s; and 13 subjects aged 80 years and older achieved 10.2 bits/s, thus reflecting the usual decline of mental performance in old age. Pure coincidence in this parallelism of channel capacity and EEG frequencies (compare Table 1) seems impossible: neither Liberson nor Lehrl, neither Bennett nor Naylor nor Pascual-Leone knew anything about the results and theories of the others.

   Higher IQ subjects have not only a higher memory span, but consequently also more complex EEG waveforms than lower IQ subjects. The most extreme compression of information is represented by the eigenvalues of the power spectrum. There are as many eigenvalues of a spectrum as there are harmonics [33]. Already in 1959 Burch (cited from Saltzberg and Burch [34]) had found that "the parameters ... of the power spectral density ... can be estimated in a completely adequate way without the necessity of performing squaring and integrating operations but simply by counting the zero crossings." The number of zero-crossings up to the P300 of evoked potentials is the upper bound of the memory span of an individual.
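Burch's zero-crossing shortcut can be sketched on synthetic data; the sampling rate, record length, and test frequency below are illustrative assumptions, not values from the cited study:

```python
import math

# Estimate the frequency of a waveform by counting zero crossings instead
# of squaring and integrating. Sketch on a synthetic 3.236 Hz sine,
# sampled at 1000 Hz for 2 s (assumed parameters).
rate, duration, freq = 1000, 2.0, 3.236
samples = [math.sin(2 * math.pi * freq * t / rate)
           for t in range(int(rate * duration))]

crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
# A sine of frequency f crosses zero about 2*f times per second.
print(f"estimated frequency: {crossings / (2 * duration):.1f} Hz")
```

The coarse estimate (3.0 Hz here) is close to the true 3.236 Hz; longer records sharpen it.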

   In such a way memory span has to be understood as the quantum of action of thought. In fact, these quanta of action represent macroscopic ordered states in the sense of quantum mechanics. Empirical analysis shows that Liberson’s fundamental is lower than 3.3 Hz, in the range between 3.1 and 3.3 Hz. The reliability of the empirical data allows no more precise calculation. Nevertheless, it could be imagined that a numerical constant underlies the harmonics of the EEG, enabling brain waves to process information in the most efficient way. Because Hz is a man-made measure, dependent on the definition of the second, an exact solution seems to be mere numerology and no scientific argument. Despite this, while the congruence between multiples of memory span and multiples of a fundamental brain wave is the first important discovery derived from Table 1, the precise size of the fundamental seems to be a problem of second order.

   From technical applications we know that an array consists of equally spaced sensors making measurements at discrete intervals [35]. Only under this condition can frequency bands and wavenumbers be detected in the spatiotemporal domain. If a travelling wave is spatially sampled using such a discrete array of sensors, an estimate of the wave is obtained by appropriately delaying or advancing the signals on each of the channels and summing the results. Therefore the idea that brain architecture and neural networks, respectively, should be understood in terms of sequences of delaying chains and matched filters facilitating run-length coding is not a new one [36].


5.  The golden mean as resonant frequency

   It is a psychoacoustic fact, known as octave equivalence [37], that all known musical cultures consider a tone twice the frequency of another to be, in some sense, the same tone as the other (only higher). The point of resonance, corresponding to the eigenvalues and zero-crossings of a wave packet (wavelet), is not the frequency of its fundamental, but half of its frequency. If we assume the fundamental to be twice the golden mean [38] Φ, that means 2 × 1.618 = 3.236 Hz, a point of resonance at Φ = 1.618 Hz follows.

   Datta [39] showed how a sense of time and evolution is intrinsically defined by the infinite continued fraction of the golden mean and its inverse. The real number set gets replaced by an extended physical set, each element of which is endowed with an equivalence class of infinitesimally separated neighbours in the form of random fluctuations. Time thereby undergoes random inversions generating well defined random scales, thus allowing a dynamical system to evolve self-similarly over the set of multiple scales. These random fluctuations generate 1/f noise, which is one of the footprints of complexity at the critical border between predictable periodic behaviour and chaos. Datta was unaware of some empirical results already supporting his theory. The distribution of the time elapsed between two consecutive spikes in the firing response of visual cortex neurons has been studied in the cat [40] and the macaque [41]. The distribution of time intervals clearly follows a power law over several orders of magnitude. In both experiments the exponent of the time separating two firings was roughly equal to 1.60 (≈ Φ).

   According to Datta [39], it seems reasonable to assume that time may change from t- to t+, not only with the usual arrow, but also instantaneously by an inversion. The definition of time inversion has an inbuilt uncertainty, thus elevating time itself to the status of a random variable. However, in the midst of all fluctuations there exists the golden mean equation φ² + φ = 1. In his theoretical approach to understanding all physical constants as a random sample of independent numbers following a 1/x probability law, Frieden came to the conclusion that the median value of all constants ought to be precisely 1, and he stated: “Why the value 1 should have this significance is a mystery. The probability density function is invariant to a change of units. Therefore, the median of the constants is 1 independent of the choice of units. This gives added strength to the result and to the mystery” [42, p. 226]. “This result holds independent of units, inversion and combination, since the 1/x law itself is invariant under these choices. Therefore, the median value of 1 is a physical effect” [42, p. 233]. We see no other solution to this mystery than the golden mean equation (for the even deeper relationship between the golden mean and the prime number distribution see, for example, [43]). In other words: for any observer there is no simple 1 in the world but only the golden mean as the only point of certainty of any measurement. Only if K = 1 does the point of the nearest-neighbour coupling strength K of the block-spin technique remain forever on the ridgeline of a hyperbolic paraboloid (cited from [38], p. 156).

   In 1995 Gilden et al. [44] asked subjects to reproduce m times a given time interval, chosen between 0.3 and 10 s, by pushing a button on a keyboard. The error was then recorded, interpreted as a time series, and its power spectrum computed. The resulting power spectrum behaved like 1/f^γ with γ about 1.

  This insight, that the measurement of any physical quantity and quality is based on repetitions of the golden mean, opens an astounding variety of possibilities to encode and decode information in the most efficient way. With this property the brain can use simultaneously the powers of the golden mean and the infinite Fibonacci word [45] (synonymously called the golden string, the golden sequence, or the rabbit sequence) for coding and classifying. Every positive integer can be written as a sum of Fibonacci numbers; it can also be understood as a finite sum of positive and negative powers of the golden mean. A binomial graph of a memory span n has n distinct eigenvalues, and these are powers of the golden mean. The number of closed walks of length k in the binomial graph is equal to the nth power of the (k+1)-st Fibonacci number [46]. The total number of closed walks of length k within memory is the nth power of the kth Lucas number.
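The claim that every positive integer is a sum of Fibonacci numbers is Zeckendorf's theorem (the summands can always be chosen non-consecutive); a minimal greedy sketch:

```python
# Zeckendorf representation: greedily subtract the largest Fibonacci
# number that fits. The result uses no two consecutive Fibonacci numbers.
def zeckendorf(n: int) -> list[int]:
    fibs = [1, 2]
    while fibs[-1] < n:
        fibs.append(fibs[-1] + fibs[-2])
    parts = []
    for f in reversed(fibs):
        if f <= n:
            parts.append(f)
            n -= f
    return parts

print(zeckendorf(100))   # [89, 8, 3]
```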

   Lifeforms maximise their adaptive capacities by entering the region of complexity at the edge of chaos. From the period-doubling route to chaos it turns out that when R = 2Φ = 2 × 1.618 = 3.236 one gets a super-stable period with two orbits, producing the first island of stability. Thus, the quasi-periodic Φ toroid geometry is the most stable under perturbation. The orbit is of the lowest period possible (being two) and therefore, crucially, consumes the least energy to maintain. Bands of order in the Feigenbaum diagram occur at a fixed scaling mean, where all bifurcations, representing the length w1 = (Φ – 1)/2, are positioned at a = 2Φ [47]. This is how Φ is embedded within dynamical systems, as a universal binary shift operator, or primary eigenfunction. All constants so derived have to be eigenvalues of this operator (think of resonances and harmonics).

   The existence of a Fibonacci series and the convergence of the ratio of the winding numbers of an orbit towards φ = (√5 – 1)/2 = 1/Φ in a Hamiltonian system is a numerically well-known phenomenon of physics. The mathematical foundation and proof of this phenomenon is the essence of the theorem of Kolmogorov, Arnold and Moser (KAM). From this theorem it follows, too, that the golden mean, which is the most irrational number, must give the most stable orbit. Irrational values of the winding number correspond to an uncountable set of zero measure - in other words, the irrationals are squeezed into a Cantor dust [48].

   A slide-rule computes products because the marks on the sliding ruler correspond to logarithms, and adding two logarithms is equivalent to multiplying the corresponding pair of numbers. The Fibonacci and Lucas numbers, too, can be understood like the markings on a ruler that is recursively divided into golden mean pieces. By using powers of the golden mean any multiplication can be reduced to an addition. The golden mean is also the ratio of the sides of a rectangle circumscribed about a logarithmic spiral. Logarithmic spirals are, like fractals, self-similar at all scales. Therefore our brain performs visual computation at several scales (demagnifications of the image) and compares the results [49]. With a sampling algorithm based on Fibonacci numbers and phyllotaxis [50], even coloured images can be quantized and processed [51].
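The slide-rule analogy can be made concrete: powers of the golden mean multiply by adding exponents, and each power collapses back to integers through the Fibonacci numbers via the standard identity Φ^n = F(n)·Φ + F(n−1):

```python
import math

# Like marks on a slide rule: multiplying powers of Phi is adding
# exponents, and every power reduces to Fibonacci numbers.
Phi = (1 + math.sqrt(5)) / 2

def fib(n: int) -> int:
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Identity Phi**n = F(n)*Phi + F(n-1) for the first few n.
for n in range(1, 10):
    assert math.isclose(Phi ** n, fib(n) * Phi + fib(n - 1))

# Multiplication as addition of exponents, as on a slide rule.
assert math.isclose(Phi ** 3 * Phi ** 4, Phi ** 7)
print("Phi**n = F(n)*Phi + F(n-1) holds for n = 1..9")
```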

   If we draw a line y = Φx on a graph (i.e. a line whose gradient is Φ), we can see directly the binary expression of the Fibonacci sequence known as the infinite Fibonacci word. Where the Φ line crosses a horizontal grid line (imagine the discrete columns of the brain) we write a 1, and where it crosses a vertical line we write a 0. As we travel along the Φ line from the origin we meet a sequence of 1s and 0s. The 1s in the Fibonacci string 1011010110… occur at positions given by the spectrum of Φ and only at those positions [52]. Trajectories of dynamical systems whose phase spaces have a negative curvature everywhere can be completely characterised by such a discrete sequence of 0s and 1s. The self-similar Fibonacci string reproduces itself upon reverse mapping or decimation, both fundamental properties from the point of view of information storage and retrieval. After decimation by a factor of the golden mean every unit in the original lattice coincides precisely with a unit in the compressed lattice. From the point of view of the renormalization theories of physics, the decimation process is the complement of deflation or block renaming. Any 1 in the Fibonacci string forces an infinite number of symbols in a characteristic quasi-periodic pattern. For any such Sturmian sequence the topological structure completely determines all the Markov approximations. This means that only one ergodic measure is compatible with the topological structure.
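The construction described above can be checked by generating the Fibonacci word with the substitution 1 → 10, 0 → 1 and comparing the positions of its 1s against the spectrum of Φ (the sequence floor(kΦ)):

```python
# Generate the Fibonacci word by applying the substitution 1 -> 10, 0 -> 1
# to every symbol simultaneously, then verify that the 1s sit exactly at
# the positions floor(k * Phi), counting positions from 1.
Phi = (1 + 5 ** 0.5) / 2

word = "1"
for _ in range(10):
    word = "".join("10" if c == "1" else "1" for c in word)

ones = [i for i, c in enumerate(word, start=1) if c == "1"]
assert ones == [int(k * Phi) for k in range(1, len(ones) + 1)]
print(word[:10])   # 1011010110, as quoted in the text
```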

   The Fibonacci string is no newcomer to computer science [53]. Processing of strings of symbols and string rewriting is the most fundamental and the most common form of computer processing: every computer instruction is a string, and every piece of data processed by these instructions is a string. A repetition in a string is a word of the form xx, the same block written twice in a row (the shortest binary examples being 11 and 00), called a square. The frequency of such squares in the Fibonacci string is a function of the logarithm of the golden mean [54].
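As an illustration, a brute-force scan (names ours) lists the distinct squares occurring in the prefix 1011010110 of the Fibonacci word; note that 00 never appears, since the 0s in a Fibonacci string are always isolated:

```python
def squares(s: str):
    """Yield (position, block) for every square xx occurring in s."""
    n = len(s)
    for i in range(n):
        for half in range(1, (n - i) // 2 + 1):
            if s[i:i + half] == s[i + half:i + 2 * half]:
                yield i, s[i:i + half]

word = "1011010110"  # prefix of the Fibonacci word
blocks = sorted({block for _, block in squares(word)})
print(blocks)  # the distinct repeated blocks found in this prefix
```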

    Since the fabrication of semiconductor superlattices arranged according to the Fibonacci and other sequences, there has been growing interest in their electronic properties. When a homogeneous electric field is applied perpendicular to the layer plane, electronic states become localised and the energy spectrum consists of a Wannier-Stark ladder, characterised by a sequence of metastable resonance states separated by equal energy intervals. An initial Gaussian wave packet is filtered selectively when passing through the superlattice. This means that only those components of the wave packet whose wave numbers belong to the allowed harmonics of the fractal-like energy spectrum can propagate over the lattice. Diez et al. [55] therefore discuss, aside from the possibility of building filter-like devices designed with a Fibonacci or binary quasi periodic sequence according to the desired application, the possibility that such a system can be used for processing information. Surely, the insight that our brain uses very similar physical and mathematical properties will accelerate technical progress in this area. Cuesta and Satija [56], studying Fibonacci lattices with defects empirically, even found: “Novel result of our studies is the relationship between the resonant states and the states where the energy bands cross. We show that the resonant states are fully transmitting states in the quasi periodic limit and are described by the wave functions that are related to the harmonics of the sine wave with fundamental Bloch number equal to the golden mean.” Bloch waves are the best-known consequence of discrete lattice translational symmetry, which arises because the Hamiltonian must commute with the translation operator for any discrete integer lattice translation. The wave function can then be represented as the product of a plane wave with a periodic function, and its translational invariance is of utmost importance.
This basically indicates that all information about the system is stored within an excited subset of the system; the rest of the non-resonant information is redundant at that moment. If we stress the analogy between waves in quasi periodic lattices and the phenomenon of memory span in our brain, this seems an especially important point. We confess to having the vision of multilayer hierarchical binary or Fibonacci semiconductor superlattices simulating the calculating and classifying capabilities of our brain, far surpassing the brain in the speed of the technical application.

    It is already well known among electrical engineers [57] that the characteristic impedance of an electrical ladder network, which is needed for an error-free connection, has to be a function of the golden mean. Even the sound of any stereo system depends on the purity of the audio signal it produces. Each strand in a cable has its own beat. When the cable linking all components together imparts its own sound, the audio signal is corrupted. George Cardas received U.S. Patent Number 4,628,151 for creating Golden Mean Stranding Audio Cable: individual strands are arranged so that each strand is coupled to another whose note or beat is irrational with its own, thus nulling interstrand resonance.
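The ladder-network result [57] is easy to reproduce numerically: adding one section to a ladder of identical resistors maps the input resistance z to r + rz/(r + z), whose fixed point is r times the golden mean, with successive values running through ratios of Fibonacci numbers. A minimal sketch (function name ours):

```python
def ladder_resistance(sections: int, r: float = 1.0) -> float:
    """Input resistance of a resistor ladder (one series and one shunt
    resistor of value r per section, terminated by a single resistor).
    As the number of sections grows it converges to r times the golden mean."""
    z = r
    for _ in range(sections):
        z = r + (r * z) / (r + z)  # add one section: series r, then r parallel to z
    return z

print(ladder_resistance(1))   # 1.5 (= 3/2, a ratio of Fibonacci numbers)
print(ladder_resistance(40))  # ~1.6180339887, the golden mean
```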


6. The universe as a world of numbers

    For the fundamental question of time reversibility, El Naschie [58], [60], [61] and others developed the notion of a Cantorian space-time (compare the idea of Cantor coding by Tsuda [59]). What is really remarkable about this Cantorian space-time is that, applying all the necessary probabilistic laws, the values of the Hausdorff dimension are intrinsically linked to the golden mean and its successive powers. The correlated fluctuations of the fractal space-time are analogous to the Bose-Einstein condensation phenomenon. The polynomial roots of higher-order Fibonaccis, scaling a quasi periodic hierarchy, are based on golden mean powers.

   There can be no doubt that our brain uses for computing inherent and inborn properties of the physical universe. We inherit or learn, within the neural networks of our brains, the relationships between external stimuli, the integer powers of the golden mean, the Fibonacci word and the Lucas numbers; we are probably able to use the relationships between the Beatty sequences of e, π and Φ, and we use hundreds of similar relationships between numbers (many of them may still be undiscovered by contemporary mathematics) for encoding and decoding information simultaneously and unconsciously by wavelets. A genius like Ramanujan gave us closed formulae which contain π, e and Φ together in a single equation. Together with Euler's famous formula e^(iπ) + 1 = 0 for the unit circle, we all understand in our subconsciousness these irrational numbers as rules for superposition and time reversal by folding, symmetry breaking and compactification. By raising Φ = 1 + φ to the third power, we get the Hausdorff dimension of the Cantorian space-time of El Naschie [61]:

 (1 + φ)^3 = 2 + √5 = 4 + φ^3 = (1 + φ)/(1 - φ) = 1/φ^3 = 4.236…, which also plays a profound role in knot theory, von Neumann algebras, quasicrystals and noncommutative geometry. But who could expect such a result and such connections on the basis of deceptively simple mathematics?
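The chain of identities above, with φ = (√5 - 1)/2, can be verified numerically in a few lines:

```python
import math

phi = (math.sqrt(5) - 1) / 2   # the golden mean phi, ~0.618034

values = [
    (1 + phi) ** 3,            # (1 + phi)^3
    2 + math.sqrt(5),          # 2 + sqrt(5)
    4 + phi ** 3,              # 4 + phi^3
    (1 + phi) / (1 - phi),     # (1 + phi)/(1 - phi)
    1 / phi ** 3,              # 1/phi^3
]

# all five expressions agree to floating-point precision
assert all(abs(v - values[0]) < 1e-9 for v in values)
print(round(values[0], 3))  # 4.236
```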

   Quantum mechanics seems to require the quantization of all physical quantities on the small scale, yet space and time are still treated in most cases as a classical space-time continuum, in which there is an infinite number of space points between any two given locations, no matter how close. Therefore many physicists agree that the current set of fundamental physical laws is incomplete. Because Hz, oscillations per second, is superficially seen only a man-made measure, this seems to be the weakest point of our line of reasoning. Behind the definition of the second is the velocity of light (c = 299792458 m/s), the constant on whose size all other physical constants depend, and which represents the inherent speed limit that any particle information pattern is able to achieve. In this system of present-day constants the Planck length has the value 1.6160 x 10^-35 m (standard uncertainty 0.0012 x 10^-35 m). If we instead fix the Planck length at the value of the golden mean, 1.6180 x 10^-35 m, and consequently recalculate all other physical variables, this means for the numerical size of the second only a trivial correction not relevant for our argument.

   Indeed, there is a growing minority of scholars who understand the world as something like a cellular automaton running with and counting numbers. The numerical state of all the cells, everywhere, changes at a regular synchronised interval called a clock cycle. The universal cellular automaton seems to be capable of updating its entire memory in a single clock cycle, which according to Occam's razor could be nothing else than the Planck time, the ratio of the Planck length (the latter fixed at the value of the golden mean) to the velocity of light. If we look, for example, into Wolfram's “A New Kind of Science”, we see that class 3 and class 4 automata are full of Pascal triangles. Behind such a triangle are always the Fibonaccis and hence the golden mean. That means that for encoding and decoding the information of any such automaton or system, no wave could be more optimal than a wavelet containing the golden mean itself. The quantization of time simply represents the number of regular clock cycles elapsed between two events, and all changes that occur must occur as localised changes. At the lowest level our brain seems to be utterly simple, deterministic and mathematical in nature. Despite this, we will not be able to read out the numeric state of any brain in the foreseeable future. We can only infer this type of information by observing larger-scale patterns, such as the phenomenon of memory span.
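The Pascal-Fibonacci link invoked above is the classical identity that the shallow-diagonal sums of Pascal's triangle are Fibonacci numbers; a minimal check (function names ours):

```python
from math import comb

def diagonal_sum(n: int) -> int:
    """Sum the n-th shallow diagonal of Pascal's triangle:
    C(n,0) + C(n-1,1) + C(n-2,2) + ..."""
    return sum(comb(n - k, k) for k in range(n // 2 + 1))

def fib(n: int) -> int:
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# shallow-diagonal sums of Pascal's triangle are Fibonacci numbers
assert [diagonal_sum(n) for n in range(12)] == [fib(n + 1) for n in range(12)]
print([diagonal_sum(n) for n in range(8)])  # [1, 1, 2, 3, 5, 8, 13, 21]
```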


7.  Conclusion

   Our paper is not an end, but a beginning. A new theory, in fact any theory, must end in one way or another by confirming what we all know to be the case, namely that space-time forms an effectively four-dimensional manifold compatible with the space-time of classical physics as well as general relativity. In addition, if this theory is to be regarded as something new, then it must be quantized in much the same way as in the Planck theory of radiation, only here it is space and time themselves that are quantized. Nonetheless, at the end we must still recover our ordinary space-time, where measurements are being made and from which there is no escape. Since 1990 El Naschie [see 60] has been building such a theory, in which the golden mean as a universal constant plays a fundamental role, and in whose context our meta-analysis of empirical results is not a logical chain of incredible wonders, but a logical consequence of the observed fact that the electric and other charges of particles are simple rational multiples, theoretically existing in a universe where time is fully spatialised and nothing more than a random Cantor set fluctuating with a golden mean Hausdorff dimension. In order to understand our brain, there is no other way than to come to a deeper understanding of the world around us.


   Without the availability of resources on the World Wide Web such as the Fibonacci page, Eric Weisstein's world of mathematics and physics, M. Watkins' number theory and physics archive and many others, our work would be impossible. We are indebted to P. Plichta, D. Winter, A. M. Selvam and V. W. de Spinadel for stimulation and critical remarks.


[1] Ilachinski A.  Cellular automata: a discrete universe. Singapore: World Scientific; 2001.

[2] Wolfram S. A new kind of science. Champaign: Wolfram Media, 2002.

[3] Kac M.  Can one hear the shape of a drum? American Mathematical Monthly, 1966; 73 (part II):1-23.

[4] Fogleman G. Quantum strings. American Journal of Physics, 1987; 55: 330-336.

[5] Eysenck HJ. The theory of intelligence and the psychophysiology of cognition, in: Advances in the psychology of human intelligence 3 (ed. Sternberg RJ.). Hillsdale, NJ: Erlbaum; 1986, pp. 1-34.

[6] Kawai N, Matsuzawa T. Numerical memory span in a chimpanzee. Nature, 2000; 403:39-40.

[7] Miller GA. The magical number seven, plus or minus two: some limits on our capacity for processing information. Psychological Review, 1956; 63:81-97.

[8] Piaget J. The theory of stages in cognitive development. In:  Measurement and Piaget (ed. Green DR.). New York: Mc Graw Hill; 1971, pp. 1-11.

[9] Pascual-Leone J. A mathematical model for the transition rule in Piaget’s developmental stages. Acta  Psychologica, 1970; 32:301-345.

[10] Halford G. The development of intelligence includes capacity to process relations of greater complexity. In: Development of intelligence (ed. Anderson M.). Hove: Psychology Press; 1999, pp. 193-213.

[11] Humphreys LG., Rich SA, Davey TC. A Piagetian test of general intelligence. Developmental  Psychology, 1985; 21:872-877.

[12] Weiss V. The relationship between short-term memory capacity and EEG power spectral density. Biological Cybernetics, 1992; 68:165-172.

[13] Lehrl S, Gallwitz A, Blaha L., Fischer B. Geistige Leistungsfähigkeit. Theorie und Messung der biologischen Intelligenz mit dem Kurztest KAI. Ebersberg: Vless; 1991.

[14] Schröder M. Fractals, chaos, power laws. New York: W.H. Freeman; 1991.

[15] Bianconi G,  Barabási AL.  Bose-Einstein condensation in complex networks. Physical  Review Letters 2001; 86:5632-5635.

[16] Albert R, Barabási AL. Statistical mechanics of complex networks. Reviews of Modern Physics 2002; 74:47-97.

[17] Wyner AD, Ziv J, Wyner AJ. On the role of pattern matching in information theory. IEEE Transactions on Information Theory, 1998; 44:2045-2056.

[18] Frank HG. Bildungskybernetik. München: Kopäd; 1996.

[19] Limpert E., Stahel WA, Abbt  M. Lognormal distributions across the sciences: keys and clues. Bioscience, 2001; 51:341-352.

[20] Naylor GFK. Perception times and rates as a function of the qualitative and quantitative structure of the stimulus. Australian Journal of Psychology, 1968; 20:165-172.

[21] Lehrl S, Fischer B. A basic information psychological parameter (BIP) for the reconstruction of concepts of intelligence. European Journal of Personality, 1990; 4:259-286.

[22] Baddeley AD, Thomson N, Buchanan N. Word length and the structure of short-term memory. Journal of Verbal Learning and Behaviour, 1975; 14:575-589.

[23] Weiss V. Memory span as the quantum of action of thought. Cahiers de Psychologie Cognitive, 1995; 14:387-408.

[24] Zipf GK. Human behavior and  the principle of least effort. Cambridge, MA: Addison-Wesley; 1949.

[25] Gibbs FA, Davis H, Lennox WG. The electroencephalogram in epilepsy and in conditions of impaired consciousness. Archives of Neurology and Psychiatry, 1935; 34:1133-1148.

[26] Liberson WT. The electrophysiology of intellectual functions (by Giannitrapani D.). Basel: Karger, 1985; pp. 153-176.

[27] Gershenfeld N. Signal entropy and the thermodynamics of computation. IBM Systems Journal, 1996; 35:577-586.

[28] Goebel PR. The mathematics of mental rotations. Journal of Mathematical Psychology, 1990; 34:435-444.

[29] Bennett ER. The Fourier transform of evoked responses. Nature, 1974; 239:407-408.

[30] Ertl JP., Schafer EWP. Brain response correlate of psychometric intelligence. Nature, 1969; 223:421-422.

[31] Flinn JM, Kirsch AD, Flinn EA. Correlations between intelligence and the frequency content of the evoked potential. Physiological Psychology, 1977, 5:11-15.

[32] Harwood E, Naylor G FK. Rates and information transfer in elderly subjects. Australian Journal of Psychology, 1969, 21:127-136.

[33] Wentzel G. Eine Verallgemeinerung der Quantenbedingungen für die Zwecke der Wellenmechanik.  Zeitschrift für Physik, 1926; 38:518-529.

[34] Saltzberg B, Burch NR. Periodic analytic estimates of the power spectrum: a simplified EEG domain procedure. Electroencephalography and Clinical Neurophysiology, 1971, 30:568-570.

[35] Bath M. Spectral analysis in geophysics. Amsterdam: Elsevier; 1974.

[36] Reiss RF. Neural theory and modelling. Stanford: Stanford University Press, 1964; pp. 105-137.

[37] Glassman RB. Hypothesized  neural dynamics of working memory: Several chunks might be marked simultaneously by harmonic frequencies within an octave band of brain waves. Brain Research Bulletin, 1999; 50:77-94.

[38] de Spinadel, VW. From the golden mean to chaos. Buenos Aires: Nueva Libreria; 1998.

[39] Datta DP. A new class of scale free solutions to linear ordinary differential equations and the universality of the golden mean (√5-1)/2. Chaos, Solitons & Fractals, 2002; arXiv:nlin.CD/0209023 v1 11 Sep 2002.

[40] Koch C. Computation and the single neuron. Nature, 1997; 385:207-210.

[41] Papa ARR, Da Silva L. Earthquakes in the brain. Theory in Biosciences, 1997, 116: 321-327.

[42] Frieden BR. Physics from Fisher information. A unification. Cambridge: Cambridge University Press; 1998.

[43] Planat M. Modular functions and Ramanujan sums for the analysis of 1/f noise in electronic circuits. arXiv:hep-th/0209243 v1 27 Sep 2002.

[44] Gilden DL, Thornton D., Mallon MW. 1/f noise in human cognition. Science, 1995, 267: 1837-1839. 

[45] Frougny C, Sakarovitch J. Automatic conversion from Fibonacci representation to representation in base phi, and a generalization. International Journal of Algebra and Computing, 1999; 9:351-384.

[46] Kennedy JW,  Christopher PR. Binomial graphs and their spectra. Fibonacci Quarterly, 1997; 35:48-53.

[47] Schürmann T. Scaling behavior of entropy estimates. Journal of Physics A: Math. Gen., 2002; 35:1589-1596.

[48] El Naschie MS. Quantum groups and hamiltonian sets on a nuclear spacetime Cantorian manifold. Chaos, Solitons & Fractals, 1999, 10:1251-1256.

[49] Weimann C, Chaitin G. Logarithmic spiral grids for image processing and display. Computer Graphics Image Processing, 1979, 11:197-226.

[50] Kapraff J, Blackmore D, Adamson G. Phyllotaxis as a dynamical system: a study in number. In: Symmetry in Plants (eds. Jean RV, Barabé D.). Singapore: World Scientific; 1998, pp. 409-458.

[51] Mojsilovic A, Soljanin E. Color quantization and processing by Fibonacci lattices. IEEE Transactions on Image Processing, 2001, 10:1712-1725.

[52] Kimberling C. A self-generating set and the golden mean.  Journal of  Integer Sequences, 2000, 3:Article 00.2.8.

[53] Berthé V, Ferenczi S, Mauduit C, Siegel A (eds.). Substitutions in dynamics, arithmetics and combinatorics. New York: Springer; 2001.

[54] Mignosi F, Restivo A, Salemi S. Periodicity and the golden ratio. Theoretical Computer Science, 1998; 204:153-167.

[55] Diez E, Dominguez-Adame, F, Maciá E., Sánchez A. Dynamical phenomena in Fibonacci semiconductor superlattices. Physical Review B, 1996; 54:792-798.

[56] Cuesta IG,  Satija II. Dimer-type correlations and band crossings in Fibonacci lattices, 1999; arXiv:cond-mat/9904022.

[57] Srinivasan TP. Fibonacci sequence, golden ratio, and a network of resistors. American Journal of Physics, 1992; 60:461-462.

[58] El Naschie MS. On the unification of heterotic strings, M theory and E(∞) theory. Chaos, Solitons & Fractals, 2000; 11:2397-2408.

[59] Tsuda I. Towards an interpretation of dynamic neural activity in terms of chaotic dynamical systems. Behavioral and Brain Sciences, 2001, 24:793-847.

[60] El Naschie MS. Modular groups in Cantorian E(∞) high-energy physics. Chaos, Solitons & Fractals, 2003; 16:353-366.

[61] El Naschie MS. Kleinian groups in E(∞) and their connection to particle physics and cosmology. Chaos, Solitons & Fractals, 2003; 16:637-649.

[1] Some authors call its inverse Φ (= (√5+1)/2 = 1.618033…) the golden mean. We hope this will cause no confusion.

Among others, The golden mean as clock cycle of brain waves has been cited by:

Elio Conte, Orlando Todarello, Antonio Federici, Francesco Vitiello, Michele Lopane, Andrei Khrennikov and Joseph P. Zbilut: Some remarks on an experiment suggesting quantum-like behavior of cognitive entities and formulation of an abstract quantum mechanical formalism to describe cognitive entity and its dynamics. Chaos, Solitons and Fractals 31 (2007) 1076-1088


El Naschie's review of infinity theory. Chaos, Solitons and Fractals 19 (2004) 209-236. "Do not quantize and do not merely discretize. You should discretize transfinitely. ... If we imagine an infinite collection of two degrees of freedom unit cells constructed sequentially and in parallel at random, then we need only to introduce a so-called wired hierarchy in the architecture of our neural network like structure and we would have some reasonable mechanical realisation of infinity space, an infinite collection of possibly nested oscillators. ... I have used in this context the well known eigenvalue theorem of Southwell and Dunkerly to show that the expected hierarchy of frequencies of vibrations are simple or complex function of the golden mean."

For a general information on the importance of the golden mean (golden section), its mathematics and further links we recommend

The excellent Fibonacci Numbers and The Golden Ratio Link Web Page

For the historical background of the golden mean we recommend