Programming the Universe by Seth Lloyd - ISBN 9781400040926 - Random House 2006

Improving my understanding of physics through the door of information. While reading, I repeatedly tried to maintain a physicalist coherence: constantly manipulating abstractions, I kept wondering what their grounding or physical constituents are.

- theory
- everything is an algorithm
- everything alive is an optimization algorithm

- everything is an algorithm
- resulting tool
- Augmented reality heuristics system from Seedea SandIdeabox

Draw a schema (using PmGraphViz or another solution) of the situation of the area in the studied domain before having read the book.

- Prologue: The Apple and the Universe
- "The significance of a bit of information depends on how that information is processed." (p7)
- "The significance of a bit depends not just on its value but on how that value affects other bits over time, as part of the continued information processing that makes up the dynamical evolution of the universe." (p7)

- 1 Introduction
- "The universe is made of bits. Every molecule, atom, and elementary particle registers bits of information. Every interaction between those pieces of the universe processes that information by altering those bits. That is, the universe computes, and because the universe is governed by the laws of quantum mechanics, it computes in an intrinsically quantum mechanical fashion; its bits are quantum bits. The history of the universe is, in effect, a huge and ongoing quantum computation. The universe is a quantum computer." (p10)
- "Quantum mechanics is the branch of physics that deals with matter and energy at its smallest scales." (p11)
- one could wonder if it wouldn't be more precise to say the "smallest scales so far" or "smallest known scales", based on the evolution of the concept of the atom in the history of science
- consequently, could this theory benefit from Wikipedia:Scale relativity (as far as it is correct...)

- "The digital revolution under way today is merely the latest in a long line of information-processing revolutions stretching back through the development of language, the evolution of sex, and the creation of life, to the beginning of the universe itself." (p11)
- "a quantum bit, or <<qubit>>, can register both 0 and 1
*at the same time*" (p11) - "A quantum computer is a democracy of information: every atom, electron, and photon participates equally in registering and processing information." (p11-12)
- "Moore’s law is a law not of nature, but of human ingenuity." (p12)
- "Even if this exponential rate of progress can be sustained, it will still take forty years [written in 2006] before quantum computers can match the number of bits registered by today’s classical computers." (p13)
- "One of the best ways to understand a law of nature is to build and operate a machine that illustrates that law." (p13)
- "Once we have seen how quantum computers work, we will be able to put bounds on the computational capacity of the universe." (p13)
- "
*Physical systems speak a language whose grammar consists of the laws of physics.*" (p14) - "The information-processing technology (e.g., the abacus) is typically inseparable from the conceptual breakthrough (e.g., zero)." (p16)
- see also my notes on Cognitive Archeology

- "communication. Every successful mutation, every instance of speciation, constitutes an advance in information processing. But for an even greater revolution, dwarfing all that followed, we turn the clock back a billion years, to the invention of sex." (p17)
- see also my notes on TheRedQueen by Matt Ridley in 1993 and TheMatingMind by Geoffrey Miller in 2000

- "Moving even farther back in time, we come to the grandmother of all information-processing revolutions, life itself." (p18)
- "The amount of information in a gene can be measured: the human genome possesses some 6 billion bits of information." (p18)
- but according to Wikipedia:Evolutionary developmental biology (aka EvoDevo) it could be slightly more complex as DNA interprets itself, generating a form of compression (or bootstrapping gradually more complex interpreters)
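- the 6-billion-bit figure can be recovered by simple arithmetic: each DNA base is one of four letters (A, C, G, T), i.e. 2 bits, over roughly 3 billion base pairs. A quick Python sketch of this back-of-the-envelope check (my own, not from the book):

```python
import math

BITS_PER_BASE = math.log2(4)        # each base is one of A, C, G, T -> 2 bits
BASE_PAIRS = 3_000_000_000          # approximate human genome length in base pairs

genome_bits = BITS_PER_BASE * BASE_PAIRS
print(f"{genome_bits:.0e} bits")    # 6e+09 bits, matching the book's figure
```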

- "the sum total of all genetic information processing performed by living organisms dwarfs the information processing performed by man made computers, and should continue to do so for quite some time." (p18)
- "The machine performing the <<universal>> computation is the universe itself." (p18)

- 2 Computation
- "it is far easier to measure a quantity of information than to say what information is." (p19)
- explanation of bits, binary digits, counting in binary, powers of 2, ...
- "The reason it is so hard to pin down is that the meaning of a piece of information depends very much on how the information is to be interpreted." (p24)
- "If you adopt Wittgenstein’s perspective that the meaning of a piece of information is to be found in the action this information provokes, the meaning of a computer program written in a particular computer language is to be found in the actions the computer performs as it interprets that program." (p25)
- "The unambiguous nature of a computer program means that one and only one meaning is assigned to each statement. If a statement in a computer language has more than one possible interpretation, an error message is the result: for computers, ambiguity is a bug. By comparison, human languages are rich in ambiguity: except in special circumstances, most statements in, for example, English, have a variety of potential meanings, and this is a key aspect of poetry, fiction, flirting, and plain everyday conversation. The ambiguity of human language is not a bug, it’s a bonus!" (p25)
- "In the same way that words can represent ideas and things, so can bits. The word and the bit are means by which information is conveyed, though the interpreter must supply the meaning." (p26)
- "if you define a computer as a machine that processes information, then pretty much anything can compute." (p26)
- "Computers date back to the early days of
*Homo sapiens*. Like the first tools, the first computers were rocks. <<Calculus>> is the Latin word for pebble, and the first calculations were performed by arranging and rearranging just that." (p27)- see again my notes on Cognitive Archeology
- pages 26 to 28 draw a very short but very interesting historical view of computing across the (human) ages
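- the bits-and-binary-counting material summarized above can be sketched in a few lines of Python (my own illustration, not from the book):

```python
# Counting in binary: n bits can distinguish 2**n alternatives,
# so specifying one of W equally likely states takes log2(W) bits.
import math

for n in range(6):
    print(n, format(n, "03b"))      # 0 -> 000, 1 -> 001, ..., 5 -> 101

states = 8
print(math.log2(states), "bits to label", states, "states")  # 3.0 bits
```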

- "types of computers to which I will refer.
- A digital computer is a computer that operates by applying logic gates to bits; a digital computer can be electronic or mechanical.
- A classical computer is a computer that computes using the laws of classical mechanics.
- A classical digital computer is one that computes by performing classical logical operations on classical bits.
- An electronic computer is one that computes using electronic devices such as vacuum tubes or transistors.
- A digital electronic computer is a digital computer that operates electronically.
- An analog computer is one that operates on continuous signals as opposed to bits; it gets its name because such a computer is typically used to construct a computational <<analog>> of a physical system.
- Analog computers can be electronic or mechanical.
- A quantum computer is one that operates using the laws of quantum mechanics. Quantum computers have both digital and analog aspects." (p27)

- presentation of the different logic gates
- "NOT, COPY, AND, and OR [...] make up a universal set of logic gates." (p29)
- "When a computer computes, all it is doing is applying logic gates to bits" (p30)
- "All sufficiently powerful systems of logic contain unprovable statements. The computational analog of an unprovable statement is an uncomputable quantity." (p31)
- "Gödel showed that the capacity for self-reference leads automatically to paradoxes in logic" (p31)
- "Turing showed that self-reference leads to uncomputability in computers" (p31)
- "our own future choices are inscrutable to ourselves" (p31)
- "The inscrutable nature of our choices when we exercise free will is a close analog of the halting problem: once we set a train of thought in motion, we do not know whether it will lead anywhere at all. Even if it does lead somewhere, we don’t know where that somewhere is until we get there." (p32)
- "Averroës (Ibn Rushd) in his studies of Aristotle concluded that what is immortal in human beings is not their soul but their capacity for reason. Reason is immortal exactly because it is not specific to any individual; instead, it is the common property of all reasoning beings." (p32)
- was this philosophy part of the inspiration that led George Boole to write "An Investigation of the Laws of Thought"?
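- the quoted universal set {NOT, COPY, AND, OR} (p29) suffices to build any logic function; a small Python sketch (my own) composing XOR out of it:

```python
# Building XOR from the universal gate set {NOT, COPY, AND, OR} quoted above.
NOT  = lambda a: 1 - a
AND  = lambda a, b: a & b
OR   = lambda a, b: a | b
COPY = lambda a: (a, a)             # fan-out: one bit feeds two wires

def XOR(a, b):
    a1, a2 = COPY(a)
    b1, b2 = COPY(b)
    # a XOR b = (a OR b) AND NOT(a AND b)
    return AND(OR(a1, b1), NOT(AND(a2, b2)))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, XOR(a, b))      # 1 only when the inputs differ
```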

- 3 The Computational Universe
- "The two descriptions, computational and physical, are complementary ways of capturing the same phenomena." (p33)
- "Quantum mechanics describes energy in terms of quantum fields, a kind of underlying fabric of the universe, whose weave makes up the elementary particles - photons, electrons, quarks." (p34)
- "in the story of the universe told in this book, the primary actor in the physical history of the universe is
*information*. Ultimately, information and energy play complementary roles in the universe: Energy makes physical systems do things. Information tells them what to do." (p34) - "entropy is the information required to specify the random motions of atoms and molecules - motions too small for us to see.
*Entropy is the information contained in a physical system that is invisible to us*." (p34) - "The laws of thermodynamics guide the interplay between our two actors, energy and information." (p36)
- "Free energy is energy in a highly ordered form associated with a relatively low amount of entropy." (p36)
- "The relatively small amount of information required to describe this energy makes it available for use: that’s why it’s called free." (p36)
- "The universe we see around us arises from the interplay between these two quantities, interplay governed by the first and second laws of thermodynamics. Energy is conserved. Information never decreases. It takes energy for a physical system to evolve from one state to another. That is, it takes energy to process information. The more energy that can be applied, the faster the physical transformation takes place and the faster the information is processed. The maximum rate at which a physical system can process information is proportional to its energy. The more energy, the faster the bits flip." (p36)
- The Story of the Universe: Part Two
- description of the information/energy evolution of the universe (p36-39)
- "It is this interplay -this back-and-forth between information and energy- that makes the universe compute." (p36)
- "If there were no alternatives to the initial state of the universe, then exactly zero bits of information were required to describe it; it registered zero bits. This initial paucity of information is consistent with the notion that the universe sprang from nothing." (p37)
- "Current physical theories suggest that the amount of energy in the early universe grew very rapidly (a process called <<inflation>>), while the amount of information grew more slowly. The early universe remained simple and orderly: it could be described by just a few bits of information. The energy that was created was free energy." (p37)
- "
*The Big Bang was also a Bit Bang*" (p37)

- "The theory of quantum mechanics gives rise to large-scale structure because of its intrinsically probabilistic nature." (p39)
- "The larger the clump grew, the hotter the matter became. If enough matter clumped together, the temperature in the center of the clump rose to the point at which thermonuclear reactions are ignited: the sun began to shine! The light from the sun has lots of free energy" (p40)
- "Alonzo Church and Alan Turing, hypothesized that any possible mathematical manipulation can be performed on a universal computer; that is, universal computers can generate mathematical patterns of any level of complexity." (p41)
- "The idea that the universe might be, at bottom, a digital computer is decades old. In the 1960s, Edward Fredkin [...] and Konrad Zuse [...] both proposed that the universe was fundamentally a universal digital computer." (p41)
- "there is no known way for them to perform a full-blown dynamical simulation of a complex quantum system without using vast amounts of dynamical resources. Classical bits are very bad at storing the information required to characterize a quantum system: the number of bits grows
*exponentially*with the number of pieces of the system." (p42) - "The failure of classical simulation of quantum systems suggests that the universe is intrinsically more computationally powerful than a classical digital computer." (p42)
- "The amount of time the quantum computer takes to perform the simulation is proportional to the time over which the simulated system evolves, and the amount of memory space required for the simulation is proportional to the number of subsystems or subvolumes of the simulated system. The simulation proceeds by a direct mapping of the dynamics of the system onto the dynamics of the quantum computer." (p43)
- "Quantum computers, then, are universal quantum simulators." (p43)
- "In fact, the universe is
*indistinguishable*from a quantum computer." (p43) - after describing Bolzmann's hypothesis of a universe created by an entirely random process, the author says "To create anything more complicated by a random process would require greater computational resources than the universe possesses." (p46)
- which let suppose that evaluation of the "computational resources [that] the universe posseses" are well known, it would be interesting to understand this evaluation and especially it's predictibility (if it can hypothesize an evolution of computational resources)
- see sections Physical Limits to Computation (p108-111) followed by The Computational Capacity of the Universe (p112-115) of chapter 7 The Universal Computer and also Further Reading
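- the quoted exponential growth of classical storage (p42) is easy to make concrete: an n-qubit state needs 2^n complex amplitudes. A quick Python illustration (my own):

```python
# Classical cost of storing an n-qubit state: 2**n complex amplitudes.
# At 16 bytes per complex number, memory grows exponentially with n.
for n in (10, 30, 50, 300):
    amplitudes = 2 ** n
    bytes_needed = 16 * amplitudes
    print(f"{n:>3} qubits -> 2^{n} amplitudes ~ {bytes_needed:.2e} bytes")
# 300 qubits already exceed any conceivable classical memory.
```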


- 4 Information and Physical Systems
- "the first law of thermodynamics is a statement about
*energy*: energy is conserved when it is transformed from mechanical energy to heat. The second law of thermodynamics, however, is a statement about*information*, and about how it is processed at the microscopic scale." (p49) - "Entropy (from the Greek for <<in turning>>) was first defined by Rudolf Clausius in 1865 as a mysterious thermodynamic quantity that limits the power of steam engines." (p50)
- "To understand what information has to do with atoms, look at the origins of the atomic hypothesis" (p52)
- "The atomic hypothesis was based on an aesthetic notion: distaste for the infinite. The ancients simply did not want to believe that you could keep subdividing matter into ever smaller pieces." (p52)
- "Temperature is energy per bit." (p53)
- introducing Maxwell's demon (p53-54)
- "
*It is the demon’s ability to get information about the atoms that allows him to accomplish this apparent violation of physical law.*" (p54) - "the entropy of a system was proportional to the number of bits required to describe the microscopic state of the atoms." (p55)
- "The epitaph on Boltzmann’s tomb reads <<S = k log W>>, which is just a fancy way of saying that the entropy of something is proportional to the number of bits registered by its microscopic state. Another way of saying the same thing is that the entropy is proportional to the length, in bits, of the number of the microscopic states. In this formula, k is known as Boltzmann’s constant." (p55)
- "But whether they knew it or not, the pioneers of statistical mechanics discovered the formula for information fifty years before the mathematical theory of information was in place." (p55)
- "Information can be created but it can’t be destroyed" (p56)
- a principle that was debated with regard to black holes, and precisely their potential use for computation; references needed
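- Boltzmann's tombstone formula S = k log W (p55) in a few lines of Python (my own sketch), giving the entropy both in J/K and as a number of bits:

```python
import math

k_B = 1.380649e-23                  # Boltzmann's constant, J/K

def entropy_joules_per_kelvin(W):
    """S = k ln W: Boltzmann's entropy for W equally likely microstates."""
    return k_B * math.log(W)

def entropy_bits(W):
    """The same quantity in bits: the length of the label of a microstate."""
    return math.log2(W)

W = 2 ** 100                        # a system with 2^100 equally likely microstates
print(entropy_bits(W))              # 100.0 bits
print(entropy_joules_per_kelvin(W)) # ~9.6e-22 J/K
```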

- "Erasure destroys the information in the bit. But the laws of physics do not allow processes that do nothing but erase a bit. Any process that erases a bit in one place must transfer that same amount of information somewhere else. This is known as Landauer’s principle, after Rolf Landauer, the pioneer of the physics of information, who discovered it in the early 1960s." (p56)
- "It is perhaps easier to conceive of an increase in entropy in these terms: energy degrading from useful to useless forms." (p58)
- "The answer lies in a fundamental fact of nature that I call <<the spread of ignorance>>. Unknown bits infect known bits." (p58)
- "The spread of ignorance is reflected in the increase of a quantity called <<mutual information>>." (p59)
- "The mutual information is equal to the sum of the entropies taken separately, minus the entropy of the two bits taken together. In other words, the two bits have exactly one bit of mutual information. Whatever information they have is held in common." (p59)
- "The infectious nature of information applies to colliding atoms as well as to bits in a computation" (p60)
- "This infection of macroscopic bits by microscopic ones is a feature of chaos. Recall that a chaotic system is one whose dynamics tend to amplify small perturbations, so that microscopic information is pumped up to the macroscopic regime." (p63)
- "Despite the confusion sown by Maxwell’s demon over the years, the final resolution is surprisingly simple: The underlying laws of physics preserve information. As a result, the total information/entropy of the gas and demon
*taken together*cannot decrease." (p66) - "Physical dynamics can be used to get information, and that information can be used to decrease the entropy of a particular element of a system, but the total amount of information/entropy does not decrease." (p67)
- "computational power requires physical resources, Laplace’s demon would have to use at least as much space, time, and energy as the universe itself." (p70)

- "the first law of thermodynamics is a statement about
- 5 Quantum Mechanics
- "Things we think of as waves correspond to particles; this is the first aspect of wave-particle duality. The second, complementary aspect of wave-particle duality is that things we think of as particles correspond to waves." (p73)
- discussion of the double-slit experiment (p73-75) with interference pattern, positive interference, negative interference, ...
- "The experiment reveals that the particle goes through both slits at once. An electron, a proton, a photon, an atom can be in two places at the same time." (p74)
- "Because of its underlying wavelike nature, a particle can be both <<here>> and <<there>> at the same time" (p74)
- "The bigger something is, the more interactions it tends to have with its surroundings, thus the easier it is to detect.
*In order to go through both slits at once and produce an interference pattern, a particle must pass through the slits undetected.*" (p75) - "This process of localization of the wave is sometimes called <<collapse of the wave function>>." (p75)
- "
*Observation (or measurement, as it is conventionally called) destroys interference*" (p75)- thus the potential usage in cryptography, see my notes on TheCodeBook#Chapter8

- "In other words, measurement intrinsically disturbs the particle" (p75)
- "The process by which the environment destroys the wavelike nature of things by getting information about a quantum system is called <<decoherence>>." (p75)
- "It is possible to combine waves. The resulting combination is referred to as a <<superposition>>." (p76)
- "The uncertainty principle states that
*if the value of some physical quantity is certain, then the value of a complementary quantity is uncertain.*Spin about the vertical axis and spin about the sideways axis are just such complementary quantities: if you know one, you can’t know the other." (p78) - "the Heisenberg uncertainty principle typically makes a difference only at very small scales, such as the atomic scale" (p78)
- "By applying the magnetic field, you flip the qubit." (p79)
- "By applying the magnetic field for different amounts of time, you can rotate the spin into any desired superposition of states." (p79)
- "These single-qubit rotations are the quantum analogs of single classical bit transformations, such as bitflip, or NOT." (p79)
- "When the pieces of a quantum system become entangled, their entropies increase. Almost any interaction will entangle the pieces of a quantum system." (p82)

- 6 Atoms at Work
- "The fact that atoms respond to light only at frequencies corresponding to their spectrum is useful if you want to send instructions to one kind of atom but not to another" (p91)
- "The ground and first excited state of an atom correspond to a bit. We can take the ground state to correspond to 0 and the first excited state to correspond to 1. But the atom is not just a bit; it is a qubit. The atom’s states correspond to waves, just like the states of the nuclear spins described earlier." (p91)
- "process, in which an atom keeps on absorbing and emitting photons, is called a <<cycling transition>>, because the atom cycles back and forth between two well-defined states." (p93)
- "Take an atom and zap it with a laser to put it in the superposition state |0> + |1>. Now drive a cycling transition to see if it is in the state 0 or the state 1. If it is 0, the atom will fluoresce; if it is 1, it will remain dark. You have tossed the quantum coin to create a brand-new bit." (p94)
- "Just as a quantum bit can register two values at once,
*a quantum computer can perform two computations simultaneously.*David Deutsch called this strange ability of a quantum computer to do two things at once <<quantum parallelism>>." (p95) - "This interference phenomenon is what gives quantum computation its special qualities and added power." (p96)
- note that according to "Quantum complexity theory", SIAM Journal on Computing, 26(5):1411-1473, 1997. by Bernstein and Vazirani, a quantum computer running in polynomial time can be simulated by a classical computer running in polynomial space
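- the quantum coin toss quoted from p94 (prepare |0> + |1>, then measure) amounts to sampling the Born rule; a minimal Python sketch (my own):

```python
import random

def measure(a0, a1):
    """Born rule: collapse the superposition a0|0> + a1|1> to a classical bit."""
    p0 = abs(a0) ** 2 / (abs(a0) ** 2 + abs(a1) ** 2)
    return 0 if random.random() < p0 else 1

# Prepare the (unnormalized) superposition |0> + |1> and toss the quantum coin.
tosses = [measure(1, 1) for _ in range(10_000)]
print(sum(tosses) / len(tosses))    # ~0.5: a brand-new random bit each time
```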

- "Quantum parallelism allows even a relatively small quantum computer, containing only a few hundred qubits, to explore a vast number of possibilities simultaneously." (p96)
- "In a quantum computation, if you wish to get the full benefit of the computation, you must not look at the computation while it is occurring." (p97)
- "measuring a quantum computer that is doing several things at once <<collapses the computer’s wave function>>, so that it ends up doing just one thing. Another way of describing the effect of such a measurement, though, is to say that it <<decoheres the computation>>." (p97)
- sections Factoring (p97-98) and Searching (p99) give the "classical" expected usages
- see also my notes on La révolution quantique dans Les Annees Lumiere, Radio Canada, July 2009

- "there are only a few quantum algorithms, such as factoring and searching, that are currently better than their classical analogs." (p99)
- "when applied to nuclear spins, these atom-zapping techniques are called <<nuclear magnetic resonance>>, or NMR" (p101)

- 7 The Universal Computer
- "<<Quantum simulation>> is a process in which a quantum computer simulates another quantum system" (p103)
- "Every part of the quantum system to be simulated is mapped onto a collection of qubits in the quantum computer, and interactions between those parts become a sequence of quantum logic operations. The resulting simulation can be so accurate that the behavior of the computer will be indistinguishable from the behavior of the simulated system itself." (p103)
- "It is because they tend to be doing many things at once that quantum systems are hard to simulate classically." (p103)
- "
*In a quantum computer, however, there is no distinction between analog and digital computation*. Quanta are by definition discrete, and their states can be mapped directly onto the states of qubits without approximation. But qubits are also continuous, because of their wave nature; their states can be continuous superpositions." (p105) - Simulation vs. Reality
- "a perfect description of the universe is indistinguishable from the universe itself." (p105)
- possibly see Simulation And Its Discontents, MIT Press, May 2009, but it seems more oriented toward social than epistemic or scientific consequences
- "We know how to map the behavior of elementary particles onto qubits and logic operations. That is, we know how the Standard Model of particle physics -a model describing our world to superb precision- can be mapped into a quantum computer. But we don’t yet know how the behavior of gravity can be mapped into a quantum computer, for the simple reason that physicists have not yet arrived at a complete theory of quantum gravity. We do not know how to simulate the universe yet, but we may know soon." (p105-106)

- Physical Limits to Computation (p108-111) followed by The Computational Capacity of the Universe (p112-115)
- "The first fundamental limitation to computational performance comes from energy. Energy limits speed." (p109)
- "The maximum rate at which a bit can flip is governed by a useful theorem called the Margolus-Levitin theorem." (p109)
- "The Margolus-Levitin theorem says that the maximum rate at which a physical system (an electron, for example) can move from one state to another is proportional to the system’s energy; the more energy available, the smaller the amount of time required for the electron to go from here to there. The theorem is very general." (p109)

- "A quantum computer, however, always flips its bits at the maximum rate" (p110)
- "The maximum number of ops per second is given by the energy E × 4 P Planck’s constant." (p110)
- "Information can’t travel any faster than the speed of light. Because the universe has a finite age and because the speed of light is finite, the part of the universe about which we can have information is also finite." (p112)
- "when we calculate <<the computational capacity of the universe>>, what we are really calculating is <<the computational capacity of the universe within the horizon>>." (p112)
- "the horizon expands, more and more objects swim into view, and the amount of energy available for computation within the horizon increases.
*The amount of computation that can have been performed within the horizon since the beginning of the universe increases over time.*" (p113) - "To get the maximum rate at which the universe can process information, then, apply the Margolus- Levitin theorem: take the amount of energy within the horizon, multiply by 4, and divide by Planck’s constant. The result is that every second, a computer made up of all the energy in the universe could perform 100,000 googol (10
^{105}) operations. Over the 14 billion years the universe has been around, this cosmological computer could have performed about 10,000 billion billion googol (10^{122}) ops." (p113) - "Over the last year and a half, then, all the computers on Earth have performed somewhat fewer than 10 billion billion billion (10
^{28}) ops. Over the entire history of computation on Earth, computers have performed no more than twice this number of ops." (p113) - "the cosmological computer could store [...] (10
^{92}) bits of information - far greater than the information registered by all of the computers on Earth. The somewhat fewer than a billion earthly computers each have somewhat fewer than 1,000 billion (10^{12}) bits of memory space, on average, so taken together, they register fewer than 1,000 billion billion (10^{21}) bits." (p113) - "The cosmological computer can have performed 10
^{122}ops on 10^{92}bits" (p113) - "These numbers of ops and bits can be interpreted in three ways:
- 1. They give upper bounds to the amount of computation that can have been performed by all the matter in the universe since the universe began. [...]
- 2. They give lower bounds to the number of ops and bits required to simulate the universe with a quantum computer. [...]
- 3. [...] the total number of ops the universe has performed in the entire time since the Big Bang is proportional to the square of that time." (p114)
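- the recipe quoted from p113 (take the energy within the horizon, multiply by 4, divide by Planck's constant) reproduces the book's numbers; a Python sketch (my own; the horizon energy of ~10^71 J is an assumption, the round figure implied by the quoted 10^105 ops/s):

```python
# Reproducing the book's cosmological-computer arithmetic (p113).
# Assumption: energy within the horizon ~1e71 J, the round figure implied
# by the quoted rate of 10^105 ops per second.
h = 6.62607015e-34                  # Planck's constant, J*s

E_horizon = 1e71                    # J, assumed (see note above)
ops_per_second = 4 * E_horizon / h  # Margolus-Levitin bound: 4E/h
print(f"{ops_per_second:.0e} ops/s")              # ~6e+104, i.e. ~10^105

age_seconds = 14e9 * 365.25 * 24 * 3600           # ~14 billion years
total_ops = ops_per_second * age_seconds
print(f"{total_ops:.0e} ops since the Big Bang")  # ~3e+122, i.e. ~10^122
```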

- "Paradigms are highly useful. They allow us to think about the world in a new way, and thinking about the world as a machine has allowed virtually all advances in science, including physics, chemistry, and biology." (p115)
- "I suggest thinking about the world not simply as a machine, but as
*a machine that processes information.*In this paradigm, there are two primary quantities, energy and information, standing on an equal footing and playing off each other." (p115) - "Perhaps the most important new insight afforded by thinking of the world in terms of information is the resolution of the problem of complexity." (p115)
- "In the computational universe [...] the innate informationprocessing power of the universe systematically gives rise to all possible types of order, simple and complex." (p115-116)
- "In the computational-universe paradigm, the concepts of space and time, together with their interaction with matter, are to be derived from an underlying quantum computation. That is, each quantum computation corresponds to a possible spacetime -or more precisely, a quantum superposition of spacetimes- whose features are derived from the features of the computation." (p117)
- "The wiring diagram for the quantum computation [detailed as Figure 14 page 116] dictates where information can go; it supplies a causal structure for spacetime. But general relativity tells us that the causal structure of spacetime fixes almost all features of the spacetime; just about the only feature that remains to be fixed are local length scales." (p118)
- "Einstein challenged John Wheeler to sum up general relativity in a simple phrase. Wheeler rose to the challenge: <<Matter tells space how to curve,>> he said, <<and space tells matter where to go.>> Let’s rephrase Wheeler’s dictum for the computational universe: <<Information tells space how to curve; and space tells information where to go>>." (p119)
- "The structure of spacetime is derived from the structure of the underlying computation." (p119)

- 8 Complexity Simplified
- "In the cosmological universal computer (the universal computer consisting of the universe itself), every atom is a bit, every photon moves its bit from one part of the computation to another, and every time an electron or a nuclear particle changes its spin from clockwise to counterclockwise, its bit flips." (p121)
- "At the beginning of the 1960s, computer scientists developed a detailed theory of how likely it was for a randomly programmed computer to produce interesting outputs. That theory is based on the idea of <<algorithmic information>>." (p123)
- was it applied to DNA, especially in the EvoDevo paradigm?

- "For any number, the <<algorithmic information content>> is defined as the length in bits of the shortest computer program enabling the computer to print out that number." (p123)
- "algorithmic information content provided in some ways a more satisfying measure of information than the length of a number in bits (which is another way of describing the number’s information content) because algorithmic information respects the intrinsic mathematical regularities of a number in a way that the length in bits fails to grasp." (p124)
- "The numbers that
*can*be produced by short programs are those that have mathematical regularities" (p124) - "As the number to be produced gets longer and longer, the length of the translating program [to convert from one computer languageto another] becomes, relatively, smaller and smaller, adding comparatively little length to the algorithmic information content." (p125)
- "Solomonoff used algorithmic information content to make Occam’s razor mathematically precise" (p125)
- "The probability that the random program the monkey inputs into the computer will give the first million digits of Pi as output is called the <<algorithmic probability>> of Pi. Since long programs are so much less likely to be typed correctly than short programs, the algorithmic probability is greatest for the shortest programs. The shortest program that can output a particular number is the most plausible explanation for how that number was produced." (p126)
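Algorithmic information content itself is uncomputable, but compressed length gives a computable upper bound on it. A quick Python sketch (my illustration, not from the book) shows that a string with "mathematical regularities" admits a far shorter description than a pseudo-random one:

```python
import random
import zlib

def compression_length(s: str) -> int:
    # Compressed length in bytes: a computable upper-bound proxy for
    # algorithmic information content (the true quantity is uncomputable).
    return len(zlib.compress(s.encode()))

regular = "01" * 500                      # 1000 digits with an obvious regularity
random.seed(42)
irregular = "".join(random.choice("01") for _ in range(1000))

# The regular string compresses to a fraction of the irregular one.
assert compression_length(regular) < compression_length(irregular)
```

This mirrors the monkey argument: the short "repeat 01" program is vastly more likely to be typed by accident than any program spelling out the irregular string digit by digit.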
- "the laws of quantum mechanics, which are constantly injecting new information into the universe in the form of quantum fluctuations." (p127)
- "from quantum seeds, came the universe itself. *Quantum fluctuations are the monkeys that program the universe*." (p127)
- "The opposite of entropy is called <<negentropy>>. Negentropy consists of known, structured bits. A system’s negentropy is a measure of how far away that system is from its maximum possible entropy." (p130)
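A toy Shannon-entropy reading of that definition (my sketch; Lloyd's negentropy is thermodynamic, but the structure is the same): negentropy is the gap between a bit string's actual entropy and the maximum of one bit per symbol.

```python
import math
from collections import Counter

def entropy_per_bit(bits: str) -> float:
    # Shannon entropy per symbol of a binary string, in bits.
    n = len(bits)
    return -sum((c / n) * math.log2(c / n) for c in Counter(bits).values())

def negentropy(bits: str) -> float:
    # Distance from maximum possible entropy (1 bit per symbol).
    return len(bits) * (1.0 - entropy_per_bit(bits))

assert negentropy("00000000") == 8.0          # fully known, structured bits
assert abs(negentropy("01100101")) < 1e-9     # balanced string: no negentropy
```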
- "The thermodynamic depth [named after <<logical depth>> defined by Charles Bennett] of a physical system is equal to the number of useful bits that went into assembling the system." (p131)
- "When applied to bit strings (for example, those produced by a randomly programmed quantum computer), thermodynamic depth is even closer to logical depth. The most plausible way a bit string can be produced is from the shortest program. Thus the thermodynamic depth of the bit string is the amount of memory space used by the quantum computer in producing the string; that is, the thermodynamic depth is the spatial computational complexity of the shortest program." (p131)
- "The total amount of computational effort that went into putting the universe together is 10^122 ops (the logical depth) performed on 10^92 bits (the thermodynamic depth)" (p131)
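The ~10^122 figure can be sanity-checked to order of magnitude with the Margolus-Levitin bound (max ops/s ≤ 2E/πħ) applied to rough, assumed values for the observable universe's mass-energy and age — a back-of-envelope sketch, not Lloyd's exact derivation:

```python
import math

hbar = 1.05e-34      # reduced Planck constant, J*s
c = 3.0e8            # speed of light, m/s
mass = 1.0e53        # kg, rough mass of the observable universe (assumption)
age = 4.3e17         # s, ~13.7 billion years

energy = mass * c ** 2                       # total mass-energy, E = mc^2
ops = 2 * energy * age / (math.pi * hbar)    # Margolus-Levitin rate x age

# Lands within an order of magnitude or two of Lloyd's 10^122.
assert 120 <= math.log10(ops) <= 123
```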
- "<<effective complexity>>, a measure of the amount of regularity in a system; this definition of complexity was originally proposed by Murray Gell-Mann." (p131-132)
- "*The amount of information required to describe a system’s regularities is its effective complexity.*" (p132)
- "The idea of axiomatic design is to minimize the information content of the engineered system while maintaining its ability to carry out its functional requirements." (p132)
- "the definition of purposeful behavior is to some degree subjective. But suppose we focus on behavior that allows a system to (a) get energy and (b) use that energy to construct copies of itself." (p133)

- "Observational evidence suggests that in the beginning the universe was simple. As far as we can tell, there may have been only one possible initial state, and that state was everywhere the same. If there were only one possible initial state at time zero, the universe contained zero bits of information. Its logical depth, thermodynamic depth, and effective complexity were also zero." (p133)
- "This initial revolution in information processing was followed by a sequence of further revolutions: life, sexual reproduction, brains, language, numbers, writing, printing, computing, and whatever comes next." (p136)
- "Each successive information-processing revolution arises from the computational machinery of the previous revolution. In terms of complexity, each successive revolution inherits virtually all of the logical and thermodynamic depth of the previous revolution." (p136)
- "Effective complexity, by contrast, need not accumulate: the offspring need not be more effectively complex than the parent. In the design process, repeated redesign to hone away unnecessary features can lead to designs that are less effectively complex but more efficient than their predecessors. In addition to being refined away, effective complexity can also just disappear. The effective complexity of an organism is at least as great as the information content of its genes. When species go extinct, their effective complexity is lost." (p136)
- "The effective complexity of a living system can be defined as the number of bits of information that affect the system’s ability to consume energy and reproduce. If we add to these two behaviors a third, to reproduce *with variation*, then we can look at the way in which effective complexity changes over time." (p136)
- and thus to link with creativity?

- "To the extent that greater effective complexity enhances the ability to reproduce, effective complexity will tend to grow; by contrast, if some variant can reproduce better with less effective complexity, then effective complexity can also decrease. In a diverse environment with many reproducing variants, we expect effective complexity to grow in some populations and decrease in others." (p137)
- "After all, a computation is just bits flipping in a systematic fashion" (p138)
- "Did life begin as an autocatalytic set? Maybe so. We won’t know for sure until we identify the circuit diagram and the program for the autocatalytic set that first started producing cells and genes. The computational universality of autocatalytic sets tells us that some such program exists, but it doesn’t tell us that such a program is simple or easy to find." (p139)
- see also my notes on Protocells: Bridging Nonliving and Living Matter

- "After the Big Bang, as different pieces of the universe tried out all possible ways of processing information, sooner or later, seeded by a quantum accident, some piece of the universe managed to find an algorithm to reproduce itself. That accident led to life." (p142)
- "*It is the richness and complexity of our shared information processing that has brought us this far.* The invention of human language, coupled with diverse social development, was a true information-processing revolution that has substantially changed the face of the Earth." (p142)
- "To paraphrase John Donne, no one is an island. Every human being on Earth is part of a shared computation." (p142)
- "We are made of atoms, like everything else. It is the way that those atoms process information and compute in concert that makes us what we are. We are clay, but we are *computational* clay." (p143)
- "Note, however, that if you assert the intelligence of the universe, you cannot deny the brilliance of one of its greatest <<ideas>> - natural selection. For billions of years, the universe has painstakingly designed new structures by a slow process of trial and error. Each <<Aha>> in this design process is a tiny quantum accident, whose consequences are elaborated by the laws of physics. Some accidents work out, others don’t. After billions of years, the result is us, and everything else." (p143)
- which seems to be entirely coherent with evolutionary epistemology

energy in dense form but low information content | `->` | energy in sparse form but high information content |

energy used | `->` | physical waste (heat, sound, ...) | `->` | potentially exploitable information (as "traces" of the process) |

Example : electrical landlines producing sound (as experienced with Sylvain in Bretagne)

Implication : creating a link between thermodynamics and information theory. An experimental process thus extracts predicted "noise". Opening : could this also include cognition? Is "thinking" our own ability to exploit "brain noise"? Was it the same process with whistling, exploiting "breathing noise"? Does every computational process follow this pattern?

(Montreal, November 2009)

- Seth Lloyd, Ph.D. thesis: Black Holes, Demons and the Loss of Coherence: How complex systems get information, and what they do with it
- Materials Processing Center at MIT
- course Information and Entropy at MIT
- 6.050J Information and Entropy taught by Seth Lloyd and Paul Penfield
- Information, Physics and Computation, OUP 2009
- quantiki initiated by the Centre for Quantum Computation (qubit) at Oxford and Cambridge University
- work by Jurgen Schmidhuber also mentioned in Further Reading
- Algorithmic Theories of Everything, Technical Report, IDSIA 2000
- Hierarchies of Generalized Kolmogorov Complexities and Nonenumerable Universal Measures Computable in the Limit, International Journal of Foundations of Computer Science, Vol. 13, No. 4, 2002
- Computable Universes & Algorithmic Theory of Everything 2003

- books on new computational paradigms
- New Computational Paradigms, Changing Conceptions of What is Computable, Springer 2008
- Super-Recursive Algorithms by Mark Burgin, Springer 2005

- [0908.4426] Hot Ice Computer by Andrew Adamatzky, August 2009
- Pattern Formation and Solitons (nlin.PS); Cellular Automata and Lattice Gases (nlin.CG)

- information is physical (2), computation is physical
- thus creativity is physical too

- section Information is Physical of the Entropy in thermodynamics and information theory article on Wikipedia
- B-Brain, Black Hole Brain, Orion's Arm - Encyclopedia Galactica
- reference to Information, Information Processing, and Gravity, S. Hsu

- Performance per watt "measure of the energy efficiency of a particular computer architecture or computer hardware. Literally, it measures the rate of computation that can be delivered by a computer for every watt of power consumed." according to Wikipedia
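As a trivial illustration of that definition (hypothetical figures, not from the article), performance per watt is just the delivered rate of computation divided by the power draw:

```python
flops = 1.0e15     # hypothetical machine delivering 1 petaFLOPS
watts = 2.0e5      # while drawing 200 kW
perf_per_watt = flops / watts

assert perf_per_watt == 5.0e9   # 5 GFLOPS per watt
```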
- L’univers est-il mathématique ?, Science Publique, France Culture 2008
- Entanglement, Information, and the Interpretation of Quantum Mechanics Gregg Jaeger, Springer 2009
- Quantum Algorithm Zoo summary of known quantum algorithms offering speedup over the best known classical algorithms
- partly answering the question "is there a hierarchy of the most powerful algorithms that would gain the most from such a paradigm shift?"

- Machine Learning with Quantum Algorithms, Google Research Blog December 2009
- The Second Law and Quantum Physics by Charles Bennett, MIT World 2007
- Wikipedia:Quantum_information
- Physics of Information / Quantum Information Group at IBM Research Yorktown
- including a link to Vol. 48, No. 1, 2004 - Physics of Information of IBM Journal of R & D

- Foundational Structures in Quantum Information and Computation at Oxford University Computing Laboratory
- arXiv blog: Physicist Discovers How to Teleport Energy, Technology Review: Blogs February 2010
- "There is a growing sense that the properties of the universe are best described not by the laws that govern matter but by the laws that govern information."

- No free lunch in search and optimization Wikipedia
- "In practice, almost all objective functions and algorithms are of such high Kolmogorov complexity that they cannot arise. There is more information in the typical objective function or algorithm than Seth Lloyd estimates the observable universe is capable of registering." (as of early April 2010)
- No Free Lunch Theorems: broadly speaking, there are two no free lunch theorems, one for supervised machine learning (Wolpert 1996) and one for search/optimization (Wolpert and Macready 1997).
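The search/optimization theorem can be checked exhaustively on a toy space: averaged over all objective functions f: {0,1,2} → {0,1}, two fixed (and opposite) search orders find the same expected best value after any number of evaluations. This is my illustration of the Wolpert-Macready result, not code from the sources above:

```python
from itertools import product

X = [0, 1, 2]            # tiny search space
order_a = [0, 1, 2]      # algorithm A: evaluate left-to-right
order_b = [2, 1, 0]      # algorithm B: evaluate right-to-left

def avg_best_after(order, k):
    # Average best objective value found after k evaluations,
    # taken over ALL possible objective functions f: X -> {0, 1}.
    bests = []
    for values in product([0, 1], repeat=len(X)):
        f = dict(zip(X, values))
        bests.append(max(f[x] for x in order[:k]))
    return sum(bests) / len(bests)

# No free lunch: both orders perform identically on average.
for k in (1, 2, 3):
    assert avg_best_after(order_a, k) == avg_best_after(order_b, k)
```

Any apparent advantage of one search algorithm over another therefore comes from structure in the particular objective functions encountered, not from the algorithm alone.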

- The Physics of Information by F. Alexander Bais and J. Doyne Farmer, SFI 2007
- reviews
- by DJ Strouse, 2008
- by John Walker, 2006
- by Yihong Ding, 2008
- by Corey S. Powell, New York Times 2006

- Finding the Most Probable Explanation using a quantum computer by Geordie, rose.blog May 2010
- Wikipedia:Ising model, a mathematical model of ferromagnetism in statistical mechanics.
- Part II: An example
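For reference, the textbook 1-D Ising energy function that this kind of work builds on (a standard form, not code specific to the blog post):

```python
def ising_energy(spins, J=1.0):
    # E = -J * sum_i s_i * s_{i+1}, with spins s_i in {-1, +1}.
    # Ferromagnetic coupling (J > 0) favours aligned neighbours.
    return -J * sum(s * t for s, t in zip(spins, spins[1:]))

assert ising_energy([1, 1, 1, 1, 1, 1]) == -5.0    # aligned: lowest energy
assert ising_energy([1, -1, 1, -1, 1, -1]) == 5.0  # alternating: highest energy
```

Finding the spin configuration that minimizes this energy for a general coupling graph is the hard optimization problem that adiabatic quantum computers such as D-Wave's target.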

- David Deutsch Quantum Computing Lecture-1, 2007
- Seth Lloyd's Quantum Computer. MIT 2007
- Adiabatic Quantum Computing by Dr. Suzanne Gildert, given to the Condensed Matter Physics group of the University of Birmingham, D-Wave 2010
- Limits on Efficient Computation in the Physical World by Scott Aaronson, 2004
- The Nature of Computation by C. Moore and S. Mertens, Oxford University Press 2011
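A minimal state-vector sketch of the qubit behaviour these lectures cover (my illustration): a Hadamard gate takes |0⟩ to an equal superposition, so a measurement yields 0 or 1 with probability 1/2 each.

```python
import math

def hadamard(state):
    # Apply the Hadamard gate to a single-qubit state (a, b) = a|0> + b|1>.
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = hadamard((1.0, 0.0))             # start in |0>
probs = [amp ** 2 for amp in state]      # Born rule: measurement probabilities

assert all(abs(p - 0.5) < 1e-12 for p in probs)
```

Applying `hadamard` twice returns the state to |0⟩, a small taste of the interference effects quantum algorithms exploit.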

- how can it be that most people have a hard time following the idea of a theory of information and of physical information, while they themselves use information daily through their very own physical brain?
- it seems that information processing is as obviously physical as paradoxically information itself would not be
- see the work of Rolf Landauer

- New Computational Paradigms which page 376, chapter "Computer Science, Informatics, and Natural Computing - Personal Reflections" concludes by quoting the first sentence of Programming the Universe
- ironically enough, can the daily-life question of conserving simple documents, like .doc or .pdf files, across time also be a higher-order form of entropy, requiring energy in order to conserve an organized state?
- based on the "added-value" of quantum computing, wouldn't it provide a much better competitive advantage to evolutionary algorithms (and all other very powerful but resource-hungry algorithms that require a very high number of simulations of states/agents/populations)?
- is there a history of computations? (see also my Needs page)
- is it possible to correlate large computations and competitive advantage?

(:new_vocabulary_start:) to stump paucity buttress coalesce (:new_vocabulary_end:)

Draw a schema (using PmGraphViz or another solution) of the situation of the area in the studied domain after having read the book. Link it to the pre-reading model and align the two to help easy comparison.

*Back to the Menu*