The J Curve

Friday, August 20, 2004

Quantum Computational Equivalence

An interesting comment on "Your Genome is Smaller than Microsoft Office" referenced quantum effects to explain the power of the interpreters of biological code.

I recently heard Wolfram present his notion of "computational equivalence", and I asked about quantum computers because it seemed like a wormhole through his logic… but he seemed to dismiss the possibility of QCs instead.

In summary, my understanding of computational equivalence is that many activities, from thinking to evolution to cellular signaling, can be represented as a computation. A physical experiment and a computation are equivalent. For an iterative system, like a cellular automaton, there is no formulaic shortcut for the interesting cases. The simulation is as complex as "running the experiment" and will consume similar computational resources.

Quantum computers can perform accurate simulations of any physical system of comparable complexity. The type of simulation that a quantum computer does results in a faithful prediction of how a system will behave in nature — something that quickly becomes intractable for any traditional computer, no matter how powerful. Professor David Deutsch of Oxford summarizes: "Quantum computers have the potential to solve problems that would take a classical computer longer than the age of the universe."

So I wonder what the existence of quantum computers would say about computational equivalence? How might this "shortcut through time" be employed in the simulation of molecular systems? Does it prove the existence of parallel universes (as Deutsch concludes in The Fabric of Reality) that entangle to solve computationally intractable problems? Is there a "quantum computational equivalence" whereby a physical experiment could be a co-processor for a quantum simulation? Is it a New New Kind of Science?

8 Comments:

  • Steve,

    When I mentioned DNA being executed on a quantum computer (of some type) I did mean just that, not just analog, non-linear and complex. It was, indeed, a little bit of a tongue-in-cheek comment, hinting at the fact that at that level it is not natural at all to differentiate "hardware" and "code" (one can argue that the difference is pretty much moot even in most traditional digital systems in use today, except for the explicit PCs ;-) ). Who knows what co-evolution of hardware and software can produce in billions of years...

    But then, again, how does one write a (non-trivial) program for a QC? When I sit down to write a C program I might say "const int answer = 42;". To declare and initialize a quantum constant, one would have to use quite an expressive language and appropriate storage media (DNA?)!
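
    One way to see why a "quantum constant" resists a one-liner: a qubit's state is not a single value but a pair of complex amplitudes, and reading it collapses it. A minimal classical sketch of this (all names here are illustrative, using only the Python standard library):

```python
import cmath
import random

# A qubit "constant" is two complex amplitudes, not one value:
# |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
zero = (1 + 0j, 0 + 0j)  # the basis state |0>

def hadamard(state):
    """Apply a Hadamard gate: sends |0> to an equal superposition."""
    a, b = state
    s = 1 / cmath.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state, rng=random.random):
    """Measurement collapses the state: 0 with probability |a|^2, else 1."""
    a, _ = state
    return 0 if rng() < abs(a) ** 2 else 1

# The closest thing to "const qubit answer": a 50/50 superposition,
# whose "value" only exists statistically, over repeated measurements.
plus = hadamard(zero)
```

    Even this two-amplitude toy shows the declaration problem: the "value" stored is a distribution, and it is destroyed by the very act of reading it.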

    And, to confuse things even further, you mention interpreters of biological code in your most recent post (when I read "interpreters" I think of software); and what puzzles me is that I myself originally wrote "virtual machine it runs on" and then, after some thought, replaced it with "hardware" -- but why do we instinctively assume that there is something even more fundamental underneath? Some cellular automaton described in "4 lines of Mathematica code"? Or not?

    I guess we are getting too close to the centuries-old question of pure determinism vs. SOMETHING that adds free will into the world, and of whether "God plays dice". Stephen Wolfram seems to be on the deterministic (~=digital) side, while quantum theory so far seems to lean to the other side. (Disclaimer: I wish I were back in graduate school and had a chance to lie on my sofa and actually _read_ the whole NKS ;-) ).

    Paul B.

    P.S. Another source of confusion is that when I read about "computational equivalence" I instinctively interpret it as "equivalence to a Turing machine", while Wolfram is talking about "equivalence to the real world". Whether those two are the same is an open question; the way I read him, he thinks they are, as here: "But it was not until the 1980s -- perhaps particularly following some of my work -- that it began to be more widely realized that Church's Thesis should best be considered a statement about nature and about the kinds of computations that can be done in our universe." Though he admits that among physicists there are still nagging doubts, mostly revolving around the perfect continua assumed in space and quantum mechanics in the traditional formalism of theoretical physics -- those physicists who claim that quantum computers are possible! ;-)

    P.P.S. I am not sure if I am a physicist or a computer scientist (being trained in both fields), but I think I have my right to make fun of both groups equally! ;-)

    By Blogger Paul B., at 12:13 AM  

  • Steve,

    Your comment about quantum computing and Wolfram's Principle of Computational Universality is actually quite astute. It ties in with a more general limitation of Wolfram's principle, which I have described in a section of my review of his book. Check out the section on "Strong and Weak Computational Universality" in

    http://www.goertzel.org/dynapsyc/2002/WolframReview.htm

    Basically, the fatal flaw in his universality principle is that he doesn't consider issues of computational efficiency (in either the space or the time domain). So what if a CA can display every possible complex phenomenon that a brain can, if it takes a billion years and a CA the size of Andromeda to do so? So what if the brain can in principle display every complex phenomenon that a superhuman AI can -- if it will take 47 times longer than the expected countdown to the Big Crunch to do so, and will require an external memory source 98.6 times larger than the size of the universe?

    Deutsch showed that quantum computers can't compute anything going beyond ordinary Turing computable functions -- but they can compute some things a lot faster. This means that, in principle, a classical computer -- or a brain, or a CA, or whatever -- could display the same funky complex patterns as a quantum computer. But it might take, oh, 10^777 times longer and require 888^888 times more memory ;-) Does this make a difference to you? It seems not to make a difference to Wolfram.
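
    The scale of "a lot faster" is easy to make concrete with a standard example (not from the post): for unstructured search over N items, a classical computer needs on the order of N queries in the worst case, while Grover's quantum algorithm needs roughly sqrt(N). A toy comparison:

```python
import math

def classical_queries(n):
    # Worst case for unstructured search: examine every item.
    return n

def grover_queries(n):
    # Grover's algorithm: on the order of sqrt(N) oracle queries.
    return math.ceil(math.sqrt(n))

# Searching a trillion items: ~10^12 classical queries vs ~10^6 quantum.
n = 10 ** 12
speedup = classical_queries(n) // grover_queries(n)
```

    Both machines compute the same Turing-computable function; only the resource count differs -- which is exactly the dimension Wolfram's equivalence principle ignores.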

    One perspective is to dismiss concerns about efficiency as quibbles, but in fact I think they're central. In my view the assessment of simplicity in space and time is just about the most basic thing in the metaphysics of intelligence -- as well as obviously being critical from a practical perspective. As Eric Baum has argued in his recent book "What Is Thought?" (which I also have some problems with, but which is significantly more sensible, though less inventive, than Wolfram's book), it's the principle of Occam's Razor that allows minds with limited physical resources to achieve reasonable amounts of intelligence. Occam's Razor is all about making estimates of computational efficiency.

    As a somewhat related aside, I have been very entertained lately by Saul Youssef's papers on quantum probability theory. It seems that you can derive basic quantum mechanics from the sole assumption that probability values are not real numbers but rather complex numbers. Funky, huh? Alas, my practical work trying to create a thinking machine and decode cell dynamics and so forth doesn't leave me time to do so, but I think it would be possible to create a new theory of *probabilistic reasoning* using complex-valued probabilities rather than the ordinary real-valued ones. This would be very interesting, and might be a way of formalizing the "logic" that would be used by a quantum-computing-based intelligent mind (unlike Penrose and some others, I don't believe that the human brain is a quantum computer in any strong sense, although I do think it's possible that macroscopic quantum effects have some effect on brain dynamics).
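
    The flavor of complex-valued "probability" is visible in the simplest interference setup: amplitudes add along indistinguishable paths, and only at the end is a real probability recovered by squaring the magnitude. A minimal illustrative sketch (the numbers are hypothetical, not from Youssef's papers):

```python
# Two indistinguishable paths to the same detector, with opposite phase.
amp1 = 0.5 + 0.0j
amp2 = -0.5 + 0.0j

# Classical (real-valued) rule: add the per-path probabilities.
p_classical = abs(amp1) ** 2 + abs(amp2) ** 2

# Quantum (complex-valued) rule: add amplitudes first, then square.
# Opposite phases cancel: destructive interference.
p_quantum = abs(amp1 + amp2) ** 2
```

    With real probabilities the two paths can only reinforce each other (0.5 here); with complex ones they can cancel to zero -- the extra degree of freedom (phase) is what a complex-probability logic would have to reason about.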

    Well, I think I rambled on a bit here -- sorry, it's early morning and my brain is still partially in the zone of dreams -- but I believe I've answered your question, at least by pointing you to my prior review that addressed a similar point ;-)

    Cheers,
    Ben Goertzel
    ben@goertzel.org

    By Anonymous Anonymous, at 4:03 AM  

  • Interesting review, Ben!

    Paul B.

    By Blogger Paul B., at 11:15 AM  

    My genome may be smaller than Microsoft Office, and more important, my cells' nuclei are much smaller than the CD where my genome could be stored, if someone had the opportunity to sequence it.
    But if the variants of expression per differentiated cell type I have all over my body, and those specialized cells in my thymus playing combinatorial molecular biology to ensure the availability of convenient antibodies, were to be stored on a CD-like physical support, you would need a fortune to pay for them, even using a strong compression algorithm.
    If you expand the "genetic material" to RNAs, then you will need some more space to take into account RNA splicing and editing. Then at the protein level, where most of the genetic material expresses itself, the data to be stored are just HUGE. I wonder what volume of DVDs would be necessary to store protein configuration changes following chemical alterations, maturation/activation by cleavage, etc.
    And this is the point where I would stop and compare with a software suite. Not just the code vs. DNA.

    By Blogger OldCola, at 2:55 AM  

  • Good to see you here Old Cola.

    If you include the extra genetic code (mitochondrial DNA and even symbionts like E. coli), the size comparison still holds since they are relatively tiny. I think your comment hits at the core of the Your Genome is Smaller than Microsoft Office post.

    It seems to me that the physical instantiation and the unpacked complexity are central to the original question: “What does this tell us about complex systems development in general?” I think the code compactness and degree of code (and subsystem) reuse across diverse organisms draws a sharp contrast between the seed vs. feed developmental approaches.

    The code alone does not begin to explain the complexity of regulatory gene feedback, protein interactions, metabolism, or the iterative application of physical, chemical and electrical feedback in “expanding” the code.

    It’s like the Mandelbrot set or cellular automaton rule 30. Each is uniquely captured in a single line of code (or a single number), and each unpacks with an iterative interpreter into a visual display of beauty. Applying each rule a trillion times to create a visual display of a trillion pixels could be said to be a trillion bits of code. These are different levels of instantiation and different levels of abstraction. In these systems, a single line of code can iterate into massive “data sets” when looked at from the end-product perspective.
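
    The rule 30 example can be made literal: the single number 30, read as an 8-bit lookup table, unpacks under iteration into an arbitrarily large pattern. A minimal sketch:

```python
RULE = 30  # the entire "program": 8 bits of lookup table

def step(cells):
    """One iteration: each cell's next value is the bit of RULE indexed
    by its 3-cell neighborhood (left*4 + center*2 + right)."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Unpack: a single 1 in a row of 101 cells, iterated 50 times,
# grows into the familiar chaotic rule 30 triangle.
width, steps = 101, 50
row = [0] * width
row[width // 2] = 1
rows = [row]
for _ in range(steps):
    rows.append(step(rows[-1]))
```

    The seed is one set bit plus the number 30; the unpacked output is thousands of cells whose pattern, pixel for pixel, is far larger than the "code" that produced it.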

    Our software traditions are very different. Microsoft Office is more likely to include a space-hogging library of clip art than the seed code to create beauty.

    By Blogger Steve Jurvetson, at 7:21 PM  

  • Stephen Wolfram: A New Kind of Science

    "For three centuries scientists have looked to mathematics to understand how nature works -- and to create the foundations for technology. But based on his surprising discoveries about simple computer programs, Stephen Wolfram, creator of Mathematica and author of A New Kind of Science, has developed a radically new approach. From the IT Conversations and SDForum Distinguished Speaker Series."

    http://www.itconversations.com/shows/detail202.html

    By Blogger Dimitar Vesselinov, at 12:30 PM  

  • I am not versed in computer science, let alone quantum computing. Notwithstanding, please allow me these thoughts. "Quantum computers can perform accurate simulation of any physical system of comparable complexity" if and only if that physical system can be reduced to precise mathematical representations. Some physical systems can be easily represented. Others cannot. Perhaps quantum computers' initial task should be to characterize these very complex physical systems in order to predict how they behave in nature. Moreover, if algorithms can be derived from this process, then perhaps classical computers can, indeed, solve such questions. I do not believe quantum physics and classical physics are mutually exclusive. Until we establish that relationship, quantum computing may be genuinely limited. The questions being posed -- the existence of parallel universes, and such -- surely do not exist, if in fact they do exist, only in the quantum realm of understanding. As such, I am suggesting that quantum computers must, of necessity, work synchronously with classical computers in order to model reality and predict how physical systems operate in nature.

    By Anonymous John L. Brown, at 8:44 PM  


Post a Comment

<< Home