User:FMabey/Bremermann's limit

Bremermann's limit, named after Hans-Joachim Bremermann, is the maximum computational speed of a self-contained system in the material universe. It is derived from Einstein's mass-energy equivalence and the Heisenberg uncertainty principle, and is c^2/h ≈ 1.36 × 10^50 bits per second per kilogram.

How it is derived

Bremermann initially states that there are three physical barriers which limit computational speed: the light barrier, the quantum barrier, and the thermodynamic barrier. The light barrier is simply that computers cannot defy the laws of physics, and so can pass information no faster than the speed of light. (Brief summary of the thermodynamic barrier here too; although it is relevant, it is not the actual topic.)

Under the quantum barrier, he makes the following proposition: that the capacity of any closed information transmission or processing system does not exceed mc^2/h bits per second, where m is the mass of the system, c the velocity of light, and h Planck's constant.[1]
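The proposed bound can be evaluated directly from the physical constants. A minimal sketch (the CODATA values for c and h used here are assumptions, not taken from Bremermann's paper):

```python
# Bremermann's proposed bound: a closed system of mass m kilograms can
# process at most m * c^2 / h bits per second.
C = 2.99792458e8    # speed of light in vacuum, m/s
H = 6.62607015e-34  # Planck's constant, J*s

def bremermann_limit(mass_kg: float) -> float:
    """Maximum processing rate, in bits per second, for a system of given mass."""
    return mass_kg * C**2 / H

# For one kilogram this reproduces the figure quoted above.
print(f"{bremermann_limit(1.0):.3e}")  # ≈ 1.356e+50 bits per second
```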

A rundown on the three physical barriers that combine to create limits on physical computation. Most detail under quantum barrier, as this is where Bremermann's limit really comes in.

Quantum barrier - theory relies on wave-particle duality, Shannon's definition of the capacity of a continuous channel, basic quantum mechanical rules (phase cannot be measured, only amplitude), and the uncertainty principle. Perhaps Von Neumann's markers.

  • energy intervals
  • signal/noise power
  • accuracy of measuring E
  • E=hν (wave-particle duality)
  • E=mc^2
  • outline that the above two formulas are combined - explain each one's relation to the topic - explain that using the other parts proves ν_max.
  • use detailed example of hydrogen atom?
  • complications introduced by thermal barrier
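The outline above can be compressed into a short sketch of the argument (a simplified summary, not Bremermann's full proof):

```latex
% Each quantum used as a signal carrier with energy E has, by
% wave-particle duality, an associated frequency, which bounds how
% rapidly its state can change and so be used to mark information:
\[ E = h\nu \quad\Rightarrow\quad \nu = \frac{E}{h}. \]
% The total energy available to a self-contained system of mass m is
% bounded by mass-energy equivalence:
\[ E \le mc^2. \]
% Combining the two gives the maximum marking rate, and hence the limit:
\[ \nu_{\max} = \frac{mc^2}{h}
   \approx 1.36\times10^{50}\ \text{bits per second per kilogram}. \]
```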

Uses in context

Computer science

This value is important when designing cryptographic algorithms, as it can be used to determine the minimum size of encryption keys or hash values required to create an algorithm that could never be cracked by a brute-force search. For example, a computer the size of the entire Earth, operating at Bremermann's limit, could perform approximately 10^75 mathematical computations per second. If we assume that a cryptographic key can be tested with only one operation, then a typical 128-bit key could be cracked in under 10^−36 seconds. However, a 256-bit key (which is already in use in some systems) would take about two minutes to crack. Using a 512-bit key would increase the cracking time to approaching 10^72 years, without increasing the time for encryption by more than a constant factor (depending on the encryption algorithms used).
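The brute-force figures above can be checked with a quick calculation. The 10^75 operations-per-second rate for an Earth-sized computer comes from the text; one operation per key tested is an assumption:

```python
EARTH_COMPUTER_OPS_PER_SEC = 1e75  # Earth-sized computer at Bremermann's limit
SECONDS_PER_YEAR = 3.156e7

def crack_time_seconds(key_bits: int) -> float:
    """Worst-case brute-force search time, assuming one operation per key tested."""
    return 2**key_bits / EARTH_COMPUTER_OPS_PER_SEC

print(crack_time_seconds(128))                     # ≈ 3.4e-37 s, under 10^-36
print(crack_time_seconds(256))                     # ≈ 116 s, about two minutes
print(crack_time_seconds(512) / SECONDS_PER_YEAR)  # ≈ 4e71 years
```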

The limit can also be used to calculate the complexity of a given algorithm. An algorithm is considered transcomputable if its computational cost exceeds all bounds that govern the physical implementation of algorithms.[2] Returning to the example of a computer the size of the Earth, which in this case has existed for the same length of time as the Earth, we can calculate that it could have processed at most 2.56×10^92 ≈ 10^93 bits.[3] This may seem like a large number at first glance, but in comparison to the complexity of many problems it is actually quite small. When compared to the number of possible games of Go, where it can be shown that a game of 50 moves on a 19×19 board has approximately 7.5×10^127 different game trees,[4] it is clear that trying to calculate all possible games would be a transcomputable problem for our theoretical computer.
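The comparison can be reproduced numerically. The Go figure comes from the text; the Earth mass and age used here are rough standard values and are assumptions, not from the source:

```python
BREMERMANN_BITS_PER_SEC_PER_KG = 1.36e50
EARTH_MASS_KG = 5.97e24              # assumed standard value
EARTH_AGE_SECONDS = 4.5e9 * 3.156e7  # ~4.5 billion years, assumed

# Upper bound on the bits an Earth-mass computer could have processed
# over the lifetime of the Earth.
max_bits = BREMERMANN_BITS_PER_SEC_PER_KG * EARTH_MASS_KG * EARTH_AGE_SECONDS

GO_GAME_TREES = 7.5e127  # 50-move games on a 19x19 board, from the text

print(f"{max_bits:.2e}")         # on the order of 10^92
print(GO_GAME_TREES > max_bits)  # True: enumerating Go is transcomputable
```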

Bremermann's limit also places an ultimate bound on the growth in processing power described by the often-cited Moore's law, which states that the number of transistors that can be placed inexpensively on an integrated circuit will double approximately every two years.
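One way to see the bound is to ask how long Moore's-law doubling could continue before a one-kilogram computer hit the limit. The starting rate of 10^18 operations per second (roughly an exascale machine) is an illustrative assumption, not a figure from the source:

```python
import math

BREMERMANN_LIMIT = 1.36e50   # bits per second per kilogram
START_OPS_PER_SEC = 1e18     # assumed starting point, ~exascale
DOUBLING_PERIOD_YEARS = 2    # Moore's law doubling period

# Number of doublings needed to reach the limit, and the time that takes.
doublings = math.log2(BREMERMANN_LIMIT / START_OPS_PER_SEC)
years = doublings * DOUBLING_PERIOD_YEARS
print(round(years))  # on the order of two centuries under these assumptions
```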

Computational problems in science

Scientific topics like genetics,[5] evolution[6] and chemistry[7] involve large numbers of building blocks, which can lead to very large numbers of permutations. Solving problems in these areas often requires enormous processing power. Although some projects, such as Folding@home, use ingenious distributed methods to gather more processing power to help find solutions, Bremermann's limit shows that these types of calculation are ultimately transcomputable. Bremermann summarizes this by saying that sheer data-processing will not solve such problems, and asserts that scientists must instead use heuristic methods.[6]

The brain and philosophy

The limit is not just applicable to computers as we traditionally think of them. It is also relevant to natural information processing - for example, protein or gene interaction. This includes our own brains - so human thinking is as limited by this value as a desktop PC is.

Because it is also applicable to natural computers, Bremermann's limit has implications in both dualism[8] and epistemology[9].

Related theories

Delve more into the note at the end of Quantum Noise and Information about similar theories by D. S. Lebedev and L. B. Levitin.
If there are existing wiki pages, a brief mention and a link. If there aren't, a brief summary with links ready for if those pages are made.

Recent developments in cGh physics have brought about a theory for a possible new value for Bremermann's limit.[10]

Notes and references

  1. ^ Bremermann, H.J. (1965) Quantum noise and information. 5th Berkeley Symposium on Mathematical Statistics and Probability; Univ. of California Press, Berkeley, California.
  2. ^ Bremermann, H.J. (1977) - "Transcomputability and Complexity" in Smith, M. & Duncan, R. (eds) - The Encyclopaedia of Ignorance. London: Routledge and Kegan Paul.
  3. ^ Wallace, D.F. (2003) - "Everything and More". Weidenfeld & Nicolson.
  4. ^ Tromp, J. & Farnebäck, G. (2006) - "Combinatorics of Go". Proceedings of the 5th international conference on Computers and games.
  5. ^ Gatherer, D. - Less is More. BioSysBio 2007: Systems Biology, Bioinformatics and Synthetic Biology Manchester, UK.
  6. ^ a b Bremermann, H.J. (1962) Optimization through evolution and recombination In: Self-Organizing systems 1962, edited M.C. Yovitts et al., Spartan Books, Washington, D.C. pp. 93–106.
  7. ^ Glanville, R. A (Cybernetic) Musing: Variety and Creativity. Cybernetics And Human Knowing, Vol.5, no.3, 1998, pp. 56–62.
  8. ^ Joao Teixeira - Computational Complexity and Philosophical Dualism. Universidade Federal de S. Carlos, Brazil.
  9. ^ Holtz, B. - Philosophy of Mind: A Functionalist Primer. in: Human Knowledge: Foundations and Limits.
  10. ^ Gorelik, G. (2009) - Bremermann's Limit and cGh-physics. ArXiv.


Category:Cybernetics Category:Theory of computation Category:Theoretical physics