Quantum Computing Since Democritus Lecture 13: How Big Are Quantum States?

A year and a half after the actual course, the remaining Democritus lecture notes are finally being transcribed and posted — thanks to my new summer student, Chris Granade. In Lecture 13, you can read about why QMA, QCMA, and BQP/qpoly are not merely complexity classes but battlefields, where competing visions of what quantum states really are face off against each other in conflicts that we as theorists intentionally provoked. (Work with me here.)

47 Responses to “Quantum Computing Since Democritus Lecture 13: How Big Are Quantum States?”

  1. hronir Says:

    Great piece of news!!!

  2. John Sidles Says:

    Hronir Says: Great piece of news!!!

    Yeah, but then in the very first paragraph, Scott disrespects “soil engineering.”

    Scott, Scott, Scott … when will you understand that of mathematics, physics, and soil engineering … the toughest and most ennobling of the three by far is soil engineering? 🙂

    To see a world in a grain of sand,
    And a heaven in a wild flower,
    Hold infinity in the palm of your hand,
    … And eternity in an hour.

    When gold and gems adorn the plow,
    To peaceful arts shall envy bow.
    A riddle, or the cricket’s cry,
    … Is to doubt a fit reply.

    Or as Ed Wilson said it in his autobiography Naturalist

    If I could do it all over again, and relive my vision in the twenty-first century, I would be a microbial ecologist.

    Ten billion bacteria live in a gram of ordinary soil, a mere pinch held between thumb and forefinger. They represent thousands of species, almost none of which are known to science.

    Into that world I would go with the aid of modern microscopy and molecular analysis. I would cut my way through clonal forests sprawled across grains of sand, travel in an imagined submarine through drops of water proportionally the size of lakes, and track predators and prey in order to discover new life ways and alien food webs.

    All this, and I need venture no more than ten paces outside my laboratory building.

    The jaguars, ants, and orchids would still occupy distant forests in all their splendor, but now they would be joined by an even stranger and vastly more complex living world virtually without end.

    For one more turn around I would keep alive the little boy of Paradise Beach who found wonder in a scyphozoan jellyfish and a barely glimpsed monster of the deep.

    Seriously (except the above is serious too), I look forward very much to reading the rest of the lecture.

    But gosh, no more “dissing” of soil science, please! The QIT community as yet can only dream of someday approaching the nobility of soil science! 🙂

  3. John Sidles Says:

    Scott, the lecture “How big are quantum states” is an outstanding conclusion to Quantum Computing Since Democritus. And the last lecture in the series Great Ideas in Theoretical Computer Science (on learning theory and sample complexity) is also outstanding.

    Like all good lectures, both of the preceding leave the audience (me, anyway) hungry for one more lecture. Specifically, a lecture on your article The Learnability of Quantum States, uniting the threads of the two preceding lectures, would be very welcome and interesting.

    And if you could include in this unifying lecture some discussion of your views on compressible objects and compressive sampling—a discipline that at present is predominantly yet needlessly classical—the resulting lecture IMHO would be hugely interesting to many people.

    Of course, it may well be that neither you nor anybody else understands the intersection of these three subjects. But heck, when has mere ignorance ever stopped a professor from giving a lecture! The lecture would be outstanding! 🙂

  4. Scott Says:

    Specifically, a lecture on your article The Learnability of Quantum States

    John, it’s a-comin’ — specifically, QCSD Lecture 15! (Though I don’t really go into compressive sampling there — I’ve told you what I see as the connection between that and learnability of quantum states, and it’s basically negative in character.)

    Incidentally, I meant no disrespect to soil engineering, just as I’m sure physicists mean no disrespect to CS when they place it alongside soil engineering! 🙂 It’s only for the sake of truth that I wished to set things straight.

  5. John Sidles Says:

    Scott, I can’t wait for QCSD #15! Oh boy!

    As for the respective merits of soil engineering versus CS/QIT, please rest easy … my tongue was firmly in cheek … which is where it tends to lodge whenever serious topics are discussed. 🙂

    And as for the connection—or lack thereof—between (classical) compressible objects and (quantum) state learnability, I am reasonably confident that clarifying connections will emerge fairly soon … for the heuristic reason that plenty of smart people are working in both fields, and they are bringing to bear overlapping mathematical and informatic tool-sets.

    Metaphorically speaking, it seems to me that “Bob the quantum information theorist” is like a hormonally charged high-school student who finds himself cast in the same play with the highly attractive “Alice the compressible object specialist.” It is a near-certainty that Alice and Bob are going to discover that they have plenty of mutual interests. 🙂

  6. matt Says:

    Scott, what’s this about “Ray has me on board”? Are you going to IQC?

  7. Scott Says:

    Matt, I was at IQC when I taught the course (a year and a half ago).

  8. csrster Says:

    Too much to read, too little time.

    I think I may have to print out all your lecture notes and take them away with me to read on my summer holidays.

  9. Robin Hanson Says:

    Consider a theory like eternal inflation. If you have grounds for believing the theory, you believe in an exponential amount of reality out there, even though you have no way to translate that reality into effective computation to solve exponential problems. So why can’t quantum reality also be exponentially big, even if you also can’t use it to compute exponential problems?

    It sounds like you might be creeping toward a quantum horizon concept, showing that much of this exponential reality quickly moves beyond one’s ability to usefully touch it, just as in eternal inflation most of spacetime is beyond your ability to usefully touch it.

    You describe the Copenhagen view as the view that the exponential stuff in our usual quantum description is all in our heads, not out there. But if our only actual candidate descriptions of what is out there do contain an exponential amount of stuff, shouldn’t our default view be that this is the kind of thing that is really there? The Copenhagen view seems just a faint hope that someone will someday find a much smaller description of what’s really there. You could have such faint hopes for eternal inflation as well, but would it make sense to talk about the Copenhagen interpretation of eternal inflation wherein reality is much smaller than it might appear?

  10. John Gordon Says:

    Please skip ahead a bit, and give us your Free Will lecture next …

  11. Raoul Ohio Says:

    The Copenhagen View might be old hat, but it is the basis of applied QM, which includes Solid State Physics, which includes transistors. I have around E10 to E12 transistors on my desk right now, with mass under a kg, designed with the CV. They are doing just fine without any inflation, dark energy, or MWW (Many Worlds Whatever).

    Here are my guesses for the probabilities of several topics being taken seriously by most physicists in 2058:

    CV: 0.99
    Dark matter: 0.80
    Dark energy: 0.45
    Quantum computers: 0.45
    Inflation: 0.25
    MWW: 0.05

    Bets, anyone?

    When unable to avoid a discussion with a “Creation Scientist”, you are likely to hear: “Evolution is just a theory”. In this situation I reply, “So is QM. Please use the axioms of your chosen religion to design enough transistors to rig up a calculator that passes the ROCT (Raoul Ohio Calculator Test): When I enter ‘2 + 2 =’, the output must be ‘4’. Then I will give serious consideration to your views”. This has not kept me too busy.

  12. Chris Granade Says:

    Robin: If I may attempt to address your question, please consider the set U of all 2×2 unitary matrices over ℂ. Each such matrix is (typically) represented by some tuple in ℂ⁴, but U contains far from all such tuples, since for instance (1, 1; 1, 0) ∉ U. Obviously, trying to draw an analogy to quantum states breaks down quickly, but the point that I’m trying to get to is that the representation of an object is not the same as the informational capacity of an object.
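
    To make that point concrete, here is a quick numpy sketch (purely illustrative; the Hadamard gate is my own example, not from the lecture): membership in U is the constraint UU† = I, which the tuple (1, 1; 1, 0) fails.

        import numpy as np

        def is_unitary(m, tol=1e-10):
            # A matrix U is in the set U of unitaries iff U @ U^dagger = I.
            return np.allclose(m @ m.conj().T, np.eye(m.shape[0]), atol=tol)

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate: in U
        M = np.array([[1, 1], [1, 0]])                 # the tuple above: not in U

        print(is_unitary(H))   # True
        print(is_unitary(M))   # False: its rows are neither normalized nor orthogonal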

    As Lecture 13 suggests (at least to my reading), it is likely the case that, in most familiar intuitive senses of the word “information,” an n-qubit quantum state gives us access to only n bits of classical information, but with some additional structure that we can exploit if we’re clever enough. To some extent, I think that, in order for the size of quantum states to make sense, the intuition behind the word “information” needs to change and become more in line with the rigorous definitions used by information and complexity theorists.

    I’m sure that once Dr. Aaronson has the time, he can give you a much better explanation, but I figured I might as well give it a go. If I’m off the tracks, then at least I tried to make sense of it.

  13. Greg Kuperberg Says:

    First, thanks for the two citations of me in this lecture!

    Second, some remarks about the Bayesian interpretation of quantum mechanics. The Bayesian interpretation is not quite exactly the same as the Copenhagen interpretation, although it is similar. In the Copenhagen interpretation, a wave function or (pure) quantum state yields probability distributions. In the Bayesian interpretation, a quantum state can be mixed or pure, and it is a generalization of a probability distribution.

    Being a generalization of something is not conceptually the same as being a source. For instance, if you were a geometer familiar only with triangles, you could conceive of a polygon as an important source of triangles. That is not entirely wrong, because one good way to describe a polygon is by triangulating it. But a better interpretation is that a polygon is a generalization of a triangle.

    To understand why a mixed state is a generalized probability distribution, you should consider what a probability distribution is in the first place. In (modern) probability, a probabilistic system is a commutative algebra of random variables. A distribution is, by definition, a linear map from this algebra to scalars, with the interpretation that the value of a random variable is its expectation. If the variable is 0-1-valued, then you can call it Boolean and its expectation is its probability. As you might expect, you obtain quantum probability by dropping the word “commutative”.
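
    If it helps, here is a minimal numpy sketch of that last paragraph (an illustration of the definitions, nothing more): the “distribution” is a matrix ρ, a “random variable” is a Hermitian matrix A, and the linear map sending a variable to its expectation is A ↦ tr(ρA). Keep everything diagonal and you recover the commutative (classical) case.

        import numpy as np

        def expectation(rho, A):
            # The state as a linear functional on the algebra: E[A] = tr(rho A).
            return np.trace(rho @ A).real

        # Commutative case: diagonal rho is an ordinary probability distribution,
        # and a diagonal 0-1-valued A is a Boolean random variable.
        rho_classical = np.diag([0.25, 0.75])
        A_boolean     = np.diag([0.0, 1.0])
        print(expectation(rho_classical, A_boolean))   # 0.75 = Pr[A = 1]

        # Drop commutativity: same functional, genuinely quantum behavior.
        plus = np.array([[0.5, 0.5], [0.5, 0.5]])      # the pure state |+><+|
        Z    = np.diag([1.0, -1.0])
        X    = np.array([[0.0, 1.0], [1.0, 0.0]])
        print(expectation(plus, Z))                    # 0.0: Z is a fair coin on |+>
        print(expectation(plus, X))                    # 1.0: X is deterministic on |+>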

  14. Robin Hanson Says:

    I think I should have said “Bayesian view” instead of “Copenhagen view.”

    Greg, a probability distribution is a description of our knowledge about a system. But is a mathematical generalization of a probability distribution, where we drop the word “commute”, also a description of our knowledge, or is it something else entirely?

    Chris, I wasn’t trying to talk about “info capacity” but about how much stuff is actually out there.

  15. Scott Says:

    At Brookhaven National Lab, which has a particle accelerator but no working Internet access in the room. On BlackBerry. Will write more later.

    Is there anyone today who calls themself a Copenhagenist and explicitly rejects the Bayesian viewpoint? (The people I’ve met who make Bohr-like moves all seem to call themselves Bayesians…)

  16. Yatima Says:

    I thought the Bohmian interpretation had been packed away and shipped out?

    …and so had the many-worlds approach?

    Stop confusing me.

  17. Scott Says:

    Yatima: Have you read David Deutsch or Eliezer Yudkowsky? They think everything besides many-worlds has been packed away and shipped out by now. The proponents of Bohmianism, like Sheldon Goldstein, speak in similar terms about their view. In short, all the various views seem alive and well to me, except in the minds of the adherents of any one of them.

  18. Scott Says:

    Please skip ahead a bit, and give us your Free Will lecture next …

    Sorry, John! The order of the lectures was determined in advance, and I lack the power to change it…

    (You totally set me up for that one.)

  19. Scott Says:

    Raoul: You need dark matter and dark energy for perfectly hardheaded reasons — not to design transistors, but to account for astronomical observations. They’re not some passing fad. Inflation is less well-established, but is also motivated by astronomical observations and makes falsifiable predictions (so far confirmed).

    And of course, transistors could be designed just as well under MWI as under the Copenhagen view. They make the same predictions!

  20. Michael Bacon Says:

    Yatima,

    The link you provide criticizing the MWI has this telling quote: “[t]he idea that the full details of the observer should be included in the Hilbert space is in violation of the scientific ethos.”

    Really? I find it hard to believe many scientists believe this. Perhaps someone can enlighten me in this regard.

  21. Scott Says:

    Robin: You ask good questions without short answers.

    I suppose the “Copenhagen interpretation of eternal inflation” would say that anything beyond our cosmological horizon either doesn’t exist, or else is meaningless to talk about.

  22. Scott Says:

    Chris: Your second paragraph is a fair summary — but what’s this “Dr. Aaronson” business? 😉

  23. Michael Bacon Says:

    You say “. . . all the various views seem alive and well to me, except in the minds of the adherents of any one of them.”

    I sense that you are an agnostic when it comes to adopting any particular interpretation. But if you HAD to choose one interpretation that you were (relatively) comfortable with, what would it be Dr. Aaronson? 🙂

  24. Scott Says:

    Michael: Is “Wait-For-The-Next-Striking-Insight-Like-Bell’s-Theorem-Or-Quantum-Computing-That-Makes-Every-Side-Reshuffle-Its-Cards” an interpretation? If not, then I guess some variant of decoherent histories or many-worlds (which as I think Weinberg said, are “terrible except for the alternatives”).

  25. Greg Kuperberg Says:

    But is a mathematical generalization of a probability distribution, where we drop the word “commute”, also a description of our knowledge

    Yes! This point of view is eventually woven into the terminology of quantum probability. A mixed state has a well-defined entropy — it’s called von Neumann entropy but it’s basically the same as Shannon entropy of a distribution. If it carries entropy, then it is a description of knowledge.
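
    A quick numpy sketch of that claim (illustrative only): for a diagonal density matrix the von Neumann entropy is literally the Shannon entropy of the diagonal, and it vanishes on pure states.

        import numpy as np

        def von_neumann_entropy(rho):
            # S(rho) = -tr(rho log2 rho), computed from the eigenvalues.
            evals = np.linalg.eigvalsh(rho)
            evals = evals[evals > 1e-12]        # 0 log 0 = 0 by convention
            return max(0.0, -np.sum(evals * np.log2(evals)))  # clamp the -0.0 edge case

        p = [0.5, 0.25, 0.25]
        print(von_neumann_entropy(np.diag(p)))  # 1.5 bits, the Shannon entropy of p

        pure = np.outer([1, 0, 0], [1, 0, 0])   # a pure state carries no entropy
        print(von_neumann_entropy(pure))        # 0.0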

    Is there anyone today who calls themself a Copenhagenist and explicitly *rejects* the Bayesian viewpoint?

    Most Copenhagenists expressly ignore either part or all of the Bayesian viewpoint. Ironically, most of those who don’t are mathematicians working in an area such as operator algebras, rather than physicists.

    In the standard textbook treatment of quantum mechanics, a pure quantum state is called a wave function and is treated as “real”. Mixed states are introduced much later and they are instead called “a formalism”. The standard explanation is that you can summarize an ensemble of wave functions, i.e. a classical distribution on pure states, by the “formalism” of density matrices. That is already a tacit rejection of quantum Bayesianism. In order to accept a quantum state as a generalized probability distribution, you have to have the operation of taking a marginal of a joint distribution. Obviously mixed states are closed under taking marginals, while pure states are not. Moreover, if you do accept the fully Bayesian view, an ensemble of quantum states is a redundancy that shouldn’t usually be discussed: it is a distribution on distributions. (It is not redundant if you can draw more than one sample from the ensemble; but then, a distribution on distributions is also fine if you can draw more than one sample.)
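
    Concretely (a small numpy sketch, for illustration): the quantum marginal is the partial trace, and tracing out half of an entangled pure state already lands outside the pure states, which is the closure failure I mean.

        import numpy as np

        def trace_out_second_qubit(rho):
            # Quantum marginal: partial trace of a two-qubit density matrix.
            return np.einsum('abcb->ac', rho.reshape(2, 2, 2, 2))

        psi = np.array([1, 0, 0, 1]) / np.sqrt(2)   # the pure state (|00> + |11>)/sqrt(2)
        rho = np.outer(psi, psi)

        print(trace_out_second_qubit(rho))
        # [[0.5 0. ]
        #  [0.  0.5]] : the marginal is maximally mixed, not pure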

    Most quantum information theorists are much closer to Bayesians in my sense. However, even most of them maintain an artificial dichotomy between classical and quantum probability. They would perhaps appreciate, but not typically use, the not-necessarily-commutative mutual generalization. For instance, there is a (good) paper in quantum information theory that defines “decoding instruments” as the realistic class of maps from a qudit to a joint system consisting of a qudit and a classical digit. The decoding instrument is a list of completely positive maps that sum to a quantum operation. I pointed out that this “decoding instrument” does not need a new name: it is just a quantum operation from a fully quantum system to a partly quantum system. The author said that this was a good remark, but it is still not standard usage in quantum information theory.

    It is, however, standard usage in operator algebras. When Stinespring defined completely positive maps in 1955, he expressly did include the commutative special case, i.e., Markov maps or stochastic matrices. But the operator algebraists of the day were not doing physics. When physicists discovered Stinespring’s paper a decade later, the unified view of quantum and classical state was overlooked.

  26. John Sidles Says:

    Scott asserts: And of course, transistors could be designed just as well under MWI as under the Copenhagen view. They make the same predictions!

    With great respect, Scott, that assertion is most definitely not true. By analogy, would any mathematician ever assert “And of course, integers can be factored by trial division just as well as by Lenstra elliptic curve factorization or by the general number field sieve. They make the same predictions!”

    Never! And there are two reasons why not: the practical reason that trial division is grossly inefficient, and the deeper reason that seeking efficiency in factoring is a good path to a deeper understanding of the integers themselves.

    Isn’t quantum physics the same? Interpretations of quantum mechanics vary hugely in their practical simulation efficiency, and seeking efficiency in simulation is a very good path to deeper understanding of quantum mechanics.

    The main difference is, our theoretical and practical understanding of efficient quantum simulation … and efficient state compression … and efficient quantum state learnability … is much less-developed than the theory of integer factoring … although of course all of these remain deeply mysterious! 🙂

  27. Greg Kuperberg Says:

    I guess I should qualify my previous comment. Most Copenhagenists are pure-state Copenhagenists, which as I explained expressly ignores the mutual generalization of quantum and classical probability. But it is true that “most” pure-state Copenhagenists — arguably all of them by definition — accept an essentially Bayesian interpretation of wave functions. There is nothing bothersome about “wave function collapse” if you accept it as at least analogous to Bayesian conditioning. My advice would be to accept it as Bayesian conditioning, not merely analogous to it. Most Copenhagenists don’t go that far. (*)

    Indeed, some of them are distinctly more Bayesian in the context of quantum theory than in the context of classical probability! It is impossible to make a rigorous distinction between “true” randomness generated by quantum measurements and “apparent” classical randomness — they may be different in examples but in the end they blur together. But I have seen people attempt a rigorous distinction.

    (*) My wife Rena Zieve and I have an inside joke that the Gram-Schmidt algorithm applied to functions, for instance to produce orthogonal polynomials, is “highly analogous” to Gram-Schmidt for vectors. I.e., a student textbook that doesn’t admit that functions are a special case of vectors also can’t admit that functional Gram-Schmidt is a special case of vector Gram-Schmidt.
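
    To beat the joke to death (a throwaway sketch, with my own toy inner product): one and the same Gram-Schmidt routine, fed coordinate vectors with the dot product or monomials with ⟨f, g⟩ = ∫₋₁¹ fg dx, produces an orthogonal basis in the first case and the Legendre polynomials (up to scaling) in the second.

        import numpy as np
        from numpy.polynomial import polynomial as P

        def gram_schmidt(vectors, inner):
            # One routine; the inner product decides what "vector" means.
            basis = []
            for v in vectors:
                for u in basis:
                    v = v - (inner(u, v) / inner(u, u)) * u
                basis.append(v)
            return basis

        # Ordinary vectors with the dot product go through the same routine:
        print(gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])], np.dot))

        # Functions as vectors: polynomials in coefficient form (constant term
        # first), with <f, g> = integral of f*g over [-1, 1].
        def l2_inner(f, g):
            antider = P.polyint(P.polymul(f, g))
            return P.polyval(1.0, antider) - P.polyval(-1.0, antider)

        monomials = [np.array([1.0, 0.0, 0.0]),    # 1
                     np.array([0.0, 1.0, 0.0]),    # x
                     np.array([0.0, 0.0, 1.0])]    # x^2
        for q in gram_schmidt(monomials, l2_inner):
            print(q)   # 1, x, and x^2 - 1/3: Legendre polynomials up to scaling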

  28. Peter Shor Says:

    John Sidles brings up an interesting question: should we consider computational complexity issues in worrying about the laws of nature? My answer would be “no.” The history of physics has generally taught us that the laws of nature are simply described, i.e., these laws are low in Kolmogorov complexity.

    Are they also easy to simulate? Our experience seems to be that this is not necessarily true. Even ignoring all the issues of quantum mechanics and BQP, we have only just now discovered feasible ways to simulate general relativity, and it’s fairly expensive computationally (although I believe in some sense still polynomial time).

    Why does Nature behave this way? Misquoting Dr. Seuss, I’ll say “don’t ask me, go ask a philosopher.”

  29. Greg Kuperberg Says:

    we have only just now discovered feasible ways to simulate general relativity, and it’s fairly expensive computationally

    But this could be because the problem is intellectually challenging rather than intrinsically hard. GR and numerical simulation of PDEs are both complicated topics, and there are not all that many people who are truly expert at both of them. And while the problem is certainly interesting, it isn’t a money maker.

  30. John Sidles Says:

    Peter, your reply reminds me of a (very old, Victorian) joke: “Suitor: Do you like Kipling? Blushing maiden: I don’t know, I’ve never kippled!” … this joke was pretty racy stuff a century ago.

    The point being, until very recently the science-and-technology community has lacked the mathematical tools that are needed to cast an informatic light on the laws of nature … we are still beginners at “kippling.” 🙂

    When I learned physics, great emphasis was placed on symmetries and the conservation laws that symmetries imply (via Noether-type theorems). Nowadays, from a quantum informatic point-of-view, we appreciate that symmetries and conservation laws are two very good approaches to reducing state-space dimensionality, and thus achieving low Kolmogorov complexity … but these by no means exhaust the list!

    In the light of modern QIT, what other mechanisms might be added to the list of mechanisms that reduce quantum Kolmogorov complexity? Surely noise can be added … along with the noise’s close cousin, thermal contact with a low-temperature reservoir … also observation-and-control … and it is very interesting to keep asking “what others?”

    A pragmatic approach that the “L_1 magic” folks are using with great success in the classical domain is not to ask about the origins of low Kolmogorov complexity—which they call “compressibility”—but instead to assert it as a starting mathematical assumption, and inquire as to what interesting theorems can then be proved … and it is surprising how beautiful these theorems are, and how useful they are in practical applications … as readers of Terence Tao’s blog know! 🙂
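
    Here, for concreteness, is the kind of toy experiment those folks run (a hedged sketch with made-up sizes, using an off-the-shelf LP solver): observe y = Ax for a random wide A and a sparse x, then recover x by minimizing the l1 norm subject to Ax = y, which is the “basis pursuit” linear program.

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        n, m, k = 60, 30, 4                   # ambient dimension, measurements, sparsity

        A = rng.standard_normal((m, n))
        x_true = np.zeros(n)
        x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
        y = A @ x_true

        # Basis pursuit: min ||x||_1 subject to Ax = y, written as an LP
        # by splitting x = x_plus - x_minus with x_plus, x_minus >= 0.
        c = np.ones(2 * n)
        result = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y, bounds=(0, None))
        x_hat = result.x[:n] - result.x[n:]

        print(np.max(np.abs(x_hat - x_true)))  # typically ~1e-9: exact recovery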

    It is in this spirit that I am eagerly looking forward to Scott’s discussion of quantum learnability … perhaps Scott has found yet another mechanism for specifying and/or understanding quantum states of low Kolmogorov complexity … or perhaps he is importing to the quantum domain some of the classical techniques that the “L_1 magic” folks are discovering.

  31. Robin Hanson Says:

    Greg, the fact that you can define a generalization of entropy to match your generalization of probability does not by itself ensure that it is a description of our knowledge about some reality. To be sure the fact that it is useful in making predictions means it has some relation to some knowledge about something, but that relation could be quite indirect and convoluted.

    We have a standard account of what it means to have info about, and a probability distribution over, some set of possible realities, and this account clearly distinguishes possibilities, knowledge of them, and probability distributions reflecting this knowledge. For any structure which is said to describe knowledge of possibilities, we want to see either a mapping into this standard account, or a new account of the relations between possibilities, knowledge of possibilities, and descriptions reflecting this knowledge.

  32. Scott Says:

    My wife Rena Zieve and I have an inside joke that the Gram-Schmidt algorithm applied to functions, for instance to produce orthogonal polynomials, is “highly analogous” to Gram-Schmidt for vectors.

    Thanks for sharing, Greg! Next time I want to get your family in stitches, I’ll crack the one about Gram-Schmidt applied to produce orthogonal polynomials being highly analogous to Gram-Schmidt for vectors. 😉

  33. Greg Kuperberg Says:

    Robin: Maybe I no longer understand your question. Classical information theory is the orthodox model of statistical knowledge. Quantum information theory is a non-commutative (or not-necessarily-commutative) generalization that still works. It works mathematically and it works physically. If you are a Bayesian, it works conceptually. I can’t think of a way in which the theory is indirect or convoluted, although I’ll grant that it’s surprising. In many key respects it’s the same as before.

    Scott: I realize that it sounds incredibly nerdy and obscure. All I can say is that it got funnier with each iteration in the context of actually teaching the material.

  34. Gil Kalai Says:

    A few quick remarks regarding the interesting issue of complexity, simulations, and the laws of nature.

    The difficulty in simulation is often a difficulty in modeling. If we do not know the precise model (or the model contains some “secret ingredients”) we have no reason to believe that simulation will be easy, even if nature can run the process easily.

    This comment applies not only to “computational complexity” but to the more basic question of how well we can simulate, given our knowledge, even with unlimited computational power. (Sort of “learnability”.)

    My impression is that Scott does worry about computational aspects of the laws of nature, and while I do not understand or agree with every specific argument that Scott makes, making the connection seems very interesting. Also, the overall belief that every process in nature is computationally feasible (in principle) seems very reasonable. As Peter said, making these connections invites a lot of philosophy, which is only for the better, isn’t it?

  35. John Sidles Says:

    That’s a fine post, Gil, with which I wholly agree. But gosh-golly, what Scott’s post asked of us wasn’t a civilized, raised-pinkie tea-drinking dialog, but rather

    “… battlefields, where competing visions of what quantum states really are face off against each other in conflicts that we as theorists intentionally provoked (work with me here)”

    And I can picture Scott chuckling to himself when writing the above, like the Duke in Huckleberry Finn:

    “There,” says he, “if that line don’t fetch them, I don’t know Arkansaw!”

    So tell you what … over the next 24 hours I’ll see if I can’t whomp up a one-screen quantum-state manifesto that’s a little bit more provocative … and perhaps it will include some block-headed ignorance to spice things up … that shouldn’t be too hard! … and perhaps other folks will too … in courtesy to Scott for inviting us to this performance of The Royal Nonesuch 🙂

  36. Jonathan Vos Post Says:

    “… every process in nature is computationally feasible (in principle) seems very reasonable.”

    How else can the universe compute its own next state within its own lifetime?

    I agree in advance that this question sweeps under the rug the (SR) non-existence of simultaneity, the complicated relationship between local and global processes, and the question of whether the universe has discrete space-and-or-time versus continuous.

    But, as I’ve been saying since 1968 (and it cracked up Feynman the first time I told him): The universe can be simulated by a computer larger or slower than the universe, which in turn can be simulated by a computer larger or slower than the universe… The universe we really live in is the least-action fixed point of such recursions.

  37. Gil Says:

    Three little remarks (which may fall short, unfortunately, of fulfilling your expectations, John):

    1) After years of being exposed mainly to the noncommutative probability point of view, I am curious whether, in the context of QM (and QC), crucial things (facts/intuitions) are lost if you do not think about Hamiltonians/waves, etc.
    Do the different approaches to the foundations of QM really make a difference? (Greg converted me so well that I tend to regard noncommutative probability as a piece of mathematics which is basic to physics, just like classical probability theory.)

    2) While formally more general than classical probability, the study of noncommutative probability (per se) seems like a very nice small area within probability theory. There were some hopes by Scott/Greg and others that a “noncommutative probabilistic method” would emerge, mimicking some of the amazing successes of the “probabilistic method” in other areas of mathematics. (Perhaps an obstacle to this idea is that noncommutative probability is quite similar in many respects to the classical kind.)

    3) The foundations of (classical) probability theory are also of much interest. When it comes to physical reality it is not clear that (outside QM) there is any meaning to classical probability beyond expressing human uncertainty (at least, this is one of the views regarding classical probability), so perhaps noncommutative probability is the only context where probability itself has an objective physical meaning.

  38. John Sidles Says:

    I’ll summarize my not-very-sophisticated point of view in prose, because the libretto for my quantum opera is progressing quite slowly. 🙂

    With regard to non-commutative probability, it’s on my list of things to learn, but not as high on that list as Scott’s quantum learnability theorems … but the plain fact is that at present I’m pretty much completely ignorant of both.

    When it comes to quantum randomness, it seems to me that the ideas of Kolmogorov-Chaitin complexity are reasonably satisfactory.

    E.g., the interferometer in our MRFM experiments delivers a binary stream of bits at about 10^12 bits/second. Heck, of course these data records are incompressible in leading order (meaning random). They have to be. Because the theorems of Kolmogorov-Chaitin complexity guarantee that not just quantum systems, but any system that encompasses an exponentially large number of measured outcomes (loosely speaking), necessarily will encompass randomness in those measured outcomes.
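
    Anyone can check this sort of thing at their desk (a toy sketch; zlib stands in, crudely, for an ideal compressor, and the “records” below are simulated, not real MRFM data):

        import os, zlib

        n = 1_000_000
        noisy  = os.urandom(n)                               # stand-in for raw measurement bits
        smooth = bytes((i // 1000) % 256 for i in range(n))  # a highly structured record

        for name, data in [("noisy", noisy), ("smooth", smooth)]:
            ratio = len(zlib.compress(data, 9)) / len(data)
            print(name, round(ratio, 4))
        # noisy compresses to ~1.0003 of its size (incompressible);
        # smooth collapses to a fraction of a percent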

    So for us engineers, the question “How Big Are Quantum States” boils down to “How Compressible Are Quantum States?” And this means compressible not only in principle, but in practice. After all, the above-mentioned MRFM interferometer is a machine for creating compressible states … it is designed to create back-action that drags the quantum state into agreement with the measurement record … because that is how measurement works … measurement processes reflect design … whether that design is via cognition or via Darwinian evolution. 🙂

    For most practical purposes—quantum computation being the main exception—it is highly desirable that quantum states be compressible, and that computations be feasible in the compressed representation. `Cuz duh, “Whereof one cannot speak, thereof one must be silent” applies in science and engineering just as much as in philosophy.

    That is why quantum physicists and quantum chemists and quantum system engineers spend pretty much all their time talking and/or computing with compressible quantum states that have computable representations.

    This has created an embarrassment of riches: the empirical methods that people use in practical quantum computations are so numerous and various as to form a dense fog of literature that makes it hard to discern the outlines of the informatic subject that we are all really studying.

    Therefore—from purely practical motivations—I am particularly interested in fog-dispelling informatic tools, and I am hopeful that Scott’s notions of quantum learnability can be added to humanity’s fog-dispelling tool-set.

  39. John Sidles Says:

    Gee, now I have “poster’s remorse”, for not posing the debatable topic:

    RESOLVED: The description of quantum measurement in terms of von Neumann-style projection operators should be abolished from undergraduate courses in quantum mechanics.

    I’m happy to argue the affirmative … and hopefully this respects Scott’s request that we “work with him!” 🙂

  40. Raoul Ohio Says:

    Scott,
    Re: DMDEI (dark matter, dark energy, inflation)
    I hear what you are saying, but these might prove to be “I can calculate, therefore it happened” bandwagons. Careful modeling of galactic dynamics and cosmic evolution produce results at odds with observation. The only thing certain is that something is wrong. Maybe we don’t understand the physics; maybe we are missing a major player, whatever. The natural response is to postulate a hidden (particle, force, phase transition, whatever) that fixes things. In some cases parameter fiddling produces a prospect (DM, DE, I, …) that is consistent with observations. The fun part is figuring out which are real and which are artifices. Early adopters think the contest is over, but at this point direct evidence is rather scant. While certainly not a DMDEI expert, I can summarize a skeptical viewpoint.

    Dark matter. Under GR, the mass we see is insufficient for the observed dynamics of galaxies and galaxy clusters. The simplest explanation is undetectable massive particles that make things add up. Other horses in the race: corrections to GR that don’t show up at solar system scale, lots of mini black holes and/or Jupiters floating about, etc. Evidence is growing that DM actually exists; gravitational lensing analysis of background galaxies seems to show DM outrunning the light matter in cluster collisions. Odds for DM: Pretty good.

    Dark energy. Measuring change in the rate of expansion of the universe is hardly Lab 101. A long chain of models, calculations, and simplifications (among the more suspect: “wouldn’t it be nice if all the bright supernovas were the same”) suggests a total rethinking of Cosmology. On one hand, there does seem to be a fairly robust signal. On the other hand, astronomy.com reports the biggest-yet hypernova, gamma-ray burst, whatever, about once a month, so the key “uniform supernovas” ansatz might prove to be wishful thinking. Odds for DE: maybe 50/50.

    Inflation. Integrating the differential equations of cosmic evolution backwards from (t=today, what you see) to (t=0, whatever) runs into problems. Likewise if you integrate forward from (t=0, some guess) to (t=today, what you see). Not a surprise given that GR is way singular at infinite density, and no one has any idea about the physics in this parameter range. Anyway, this integration uncovers the “smoothness problem” (the universe today appears too smooth for certain guesses about the smoothness near t=0), and others. When a differential equation encounters problems at singularities, one looks for an ad hoc band-aid to patch things up. This works fine for shock waves, etc. The inflation phase transition allows one to start an integration slightly closer to t = 0, and finesse the smoothness gap. Odds that inflation actually happened: anyone’s guess; sounds like wishful thinking.

    Postscript. While enjoying a few beers with David Schramm about 20 years ago, I asked if he actually believed in inflation, or just enjoyed the theorizing. This led to a discussion of how easy it is to believe in whatever you are working on. He said he thought inflation was more likely than not.

  41. Jonathan Vos Post Says:

    I don’t think we should all wait in idleness for JDEM (Joint Dark Energy Mission), which is one of three Einstein Probe missions spelled out in NASA’s Beyond Einstein Roadmap, with a targeted total life cycle cost cap of ~$600M (FY08$) excluding launch vehicle cost. This cost cap includes NASA, DOE, and any other domestic or international contributions.

    Following the Announcement of Opportunity in late 2008, the science investigation will be selected in 2009, with a project start that same year. The launch of JDEM is planned for 2014-2015, and a nominal science operations lifetime of three years is assumed, although not required.

    I confirmed when he phoned me yesterday (on an unrelated matter) that Michael Salamon (NASA HQ/SMD/Astrophysics) is in charge of the mission from the NASA side, but, given his being heard recently on National Public Radio and quoted in the Science section of the New York Times, I’d prefer not to say anything which could be politically uncomfortable for him and his colleagues.

    Until that data pours in, let a thousand theoretical flowers bloom.

  42. Scott Says:

    Raoul, IANAAP, but my understanding is that the cases for both dark matter and dark energy have improved enormously in recent years. For dark matter, there’s the bullet cluster, which rules out pretty much any explanation based on a modification to GR. For dark energy, there’s not only the supernova observations (which themselves keep getting better) but also the results from WMAP, which complement and support them.

    Personally, I’ve never really understood what induces people to stay on the fence about these things even after evidence has piled up for a decade or more. Why shouldn’t there be another type of matter, or a vacuum energy? Both not only fit the data, but are perfectly compatible with everything else we know about the laws of physics (unlike many alternative ideas that people seem willing to entertain).

  43. Scott Says:

    Jonathan, please try to resist the urge to reference, in every single one of your comments, some conversation you had with Feynman or another important or prestigious person. It is really getting on my nerves. Thanks!

  44. Michael Bacon Says:

    Boy Scott, that last comment seems to have put the brakes on the discussion. Let’s try something different.

    John,

    You said that “[i]nterpretations of quantum mechanics vary hugely in their practical simulation efficiency . . .” Could you please explain in simple terms 🙂 what you’re referring to? Thanks.

  45. John Sidles Says:

    Hi Michael! Yeah, I was just now communicating—via ouija board—with the spirits of Dirac and von Neumann on that precise topic! 🙂

    If the same question were asked about fluid mechanics then it would be reasonably clear how to answer. Everyone knows that fluids are fundamentally made out of atoms … but it is not practical to track every atom … and so some form of model order reduction is required … a plethora of techniques have been developed to achieve this … these techniques include PDEs (like Navier-Stokes), finite element methods, cellular automata … the list is immensely long … the ingenuity of fluid dynamical researchers is unbounded … and the resulting creative opportunities in math, science, and engineering are ever-expanding.

    So we can ask, how is quantum mechanics different than fluid mechanics? Here we may hope to elicit some enjoyable differences of opinion! 🙂

    To keep it short, here are three comparative assertions about fluid-versus-quantum mechanical systems … and I will argue that all three can be plausibly rejected.

    RESOLVED: Fluid systems cannot carry out computations, but quantum systems can.

    Here I would argue the negative, on the heuristic grounds that proving that turbulent Navier-Stokes flows are not carrying out (encoded) computations would be a big step toward winning the third Clay Millennium Prize! It is true that most fluid systems are not carrying out computations … and precisely the same is true of most quantum systems.

    RESOLVED: Fluid systems have a natural reduced-order state-space (namely, finite fluid elements), but quantum systems don’t.

    Here I would again argue the negative. The practical experience of the larger science and technology community is that multilinear algebraic varieties provide a natural reduced-order state-space for quantum systems (and I refer folks to the highly readable articles of Gregory Beylkin and Martin Mohlenkamp on this subject). Pretty much every quantum research discipline since the 1920s has used multilinear varieties as a preferred state-space, but it seems that no two communities call them by the same names.

    RESOLVED: Generic fluid mechanical systems can be simulated in polynomial space and time, but generic quantum mechanical systems can’t.

    Here for the third time I would argue the negative … provided “generic” is understood to mean “noisy and/or open.” Based on the preceding two principles, the key to practical quantum simulation efficiency evidently is to choose a noise (and/or measurement) unravelling that allows the simulation to use compressed state-representations (like the above algebraic varieties) as the fundamental computational elements.

    So the direct answer to your question, Michael, is that with the proper choice of formalism, quantum systems are generically easy to simulate (meaning, not particularly harder than fluid systems … or 3D+time general relativity … which in practice means “mighty tough” to simulate).
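
    The simplest instance of such a compressed representation (a toy numpy sketch, not a serious simulation method) is Schmidt truncation: a weakly entangled bipartite pure state has rapidly decaying singular values, so keeping only the leading terms stores it in far fewer numbers at negligible error. The multilinear varieties above are the many-body generalization of this one trick.

        import numpy as np

        rng = np.random.default_rng(1)
        d = 64                                    # dimension of each half of the system

        # Build a weakly entangled pure state: amplitudes as a d x d matrix
        # whose singular (Schmidt) values decay geometrically.
        U = np.linalg.qr(rng.standard_normal((d, d)))[0]
        V = np.linalg.qr(rng.standard_normal((d, d)))[0]
        s = 2.0 ** -np.arange(d)
        psi = (U * s) @ V.T
        psi /= np.linalg.norm(psi)

        # Compress: keep only the r largest Schmidt terms.
        r = 8
        Uf, sf, Vf = np.linalg.svd(psi)
        psi_r = (Uf[:, :r] * sf[:r]) @ Vf[:r, :]

        print(psi.size, "amplitudes stored as", r * (2 * d + 1), "numbers;",
              "error =", np.linalg.norm(psi - psi_r))   # ~4e-3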

    My main regret in the above assertions is that the resulting overall picture is wholly consistent with the quantum orthodoxy of textbooks like (e.g.) Nielsen and Chuang.

    But fortunately there are plenty of non-orthodox questions left over to speculate about. For example, we know that atoms are the building-blocks of fluid mechanics … but do we really know that (linear) Hilbert spaces are the building-blocks of quantum mechanics?

  46. Jonathan Vos Post Says:

    Re: #43, #44

    Scott is correct, both as a general observation and as a blog-moderation policy. I do tend to slip into what others see as name-dropping. It does not matter that I see it as attempting to pass on the previously unwritten oral history of an unlikely but actual number of great minds that I have had the chutzpah to get to know personally. I apologize.

    Re: #45

    John Sidles asks a fascinating question. I have little to contribute, because I am still working through Terry Tao’s blog explanations of why Navier-Stokes is so gosh-darned hard.

    Except maybe to comment that all turbulent fluids ARE performing computations, by the unproven inclusive definition of Stephen Wolfram in A New Kind of Science.

    The Pentagon way would be to have a “fly-off” between a billion-dollar fluidic computer and a billion-dollar quantum computer, and award an astronomically large contract to the contractor team who built the winning demo.

    Also, to note that quantum liquids don’t seem to have a Navier-Stokes problem. See the overview by A. J. Leggett, Science 319 (29 Feb 2008), 1203–1204.

  47. mitchell porter Says:

    Greg and I had a discussion about noncommutative probability once before, and I think Robin Hanson’s second paragraph, in #31 above, is an excellent statement of the problems with the idea.