When modular arithmetic was a STOC result

So, it seems the arXiv is now so popular that even Leonhard Euler has contributed 25 papers, despite being dead since 1783. (Thanks to Ars Mathematica for this important news item, as well as for the hours of procrastination on my part that led to its rediscovery.) Since I’d long been curious about the mathematical research interests of the nonliving, I decided to check out Leonhard’s most recent preprint, math.HO/0608467 (“Theorems on residues obtained by the division of powers”). The paper starts out slow: explaining in detail why, if a mod p is nonzero, then a^2 mod p, a^3 mod p, and so on are also nonzero. By the end, though, it’s worked out most of the basics of modular arithmetic, enough (for example) to analyze RSA. Furthermore, the exposition, while “retro” in style, is sufficiently elegant that I might even recommend acceptance at a minor theory conference, even though the basic results have of course been known for like 200 years.

Oh — you say that Mr. E’s papers were as difficult and abstract for their time as Wiles and Perelman’s papers are for our own time? BULLSHIT. Reading the old master brings home the truth: that, for better and worse, math has gotten harder. Much, much harder. And we haven’t gotten any smarter.

34 Responses to “When modular arithmetic was a STOC result”

  1. Luca Says:

    I was expecting to see 16 reasons why I should believe that math has gotten harder.

  2. Scott Says:

    Where do you get 16? Obviously the number of reasons has to be a sum of two positive squares.

    (Unless the number’s zero, which makes for the best blog posts of all.)

  3. Anonymous Says:

    We haven’t gotten smarter, but we now have the shoulders of giants to ride on, which Euler didn’t have the benefit of riding on (or something).

  4. Brian Says:

    Math has gotten harder. Much, much harder. And we haven’t gotten any smarter.

    Perhaps, but there are a lot more of us nowadays.

    Still, you raise an interesting point: if we accept as given that (1) math research is constantly getting harder, and (2) human population can’t scale along with that hardness, then the rate of progress in mathematical research must eventually slow to a near halt.

    Maybe computers can help.

  5. Anonymous Says:

    Scott, Scott…you are missing an important point. By choosing to publish only in ARXIV, Mr. Euler and Mr. Perelman have given the finger to silly conferences like STOC and silly refereed journals. They have gone directly to the heart of the matter, ARXIV: a path that achieves their goal perfectly with the greatest economy of resources; the purest, simplest, sufficient path. Great minds abhor the superfluous pomposity of STOC.

  6. Scott Says:

    You’ve got a point. Come to think of it, g^(r+s)=g^r g^s (mod p) is one of the less trivial results to appear in an arXiv preprint…
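    A quick Python sanity check of that identity, with p and g chosen arbitrarily (not part of the original thread):

```python
# Verify g^(r+s) = g^r * g^s (mod p) for a range of exponents,
# using Python's three-argument pow() for modular exponentiation.
p = 101  # a small prime, chosen arbitrarily for illustration
g = 3

for r in range(1, 10):
    for s in range(1, 10):
        assert pow(g, r + s, p) == (pow(g, r, p) * pow(g, s, p)) % p

print("identity holds for all tested exponents")
```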

  7. Anonymous Says:


    … then the rate of progress in mathematical research must eventually slow to a near halt.

    The problem is more that the increase in life expectancy of human beings doesn’t scale along with the hardness of mathematics.

  8. Greg Kuperberg Says:

    Scott is surely correct that math has gotten a lot harder. It would be depressing if it hadn’t — it would mean that society wasn’t accumulating mathematical ideas and moving on. Indeed, it was extremely depressing when mathematics did get easier in certain historical periods, for example when Greek mathematics was forgotten in the Middle Ages in Europe.

    But it is easy to take Scott’s observation too far, as some people here have done. Any individual mathematics paper is only as hard as society wants it to be. The real question is not that, but how well mathematicians can cooperate to produce more mathematics.

    Two remarks on that. First, human society is still extremely far away from full cooperation in the production of mathematics. Things have gotten dramatically better in the past 50 years, but I imagine that there is still at least a factor of 100 or 1000 left to go, even without counting the use of computers to produce mathematics. A Tao or a Perelman is much more likely to be found and properly educated now than in 1950; and once educated, he or she can communicate and visit with other mathematicians much more easily. There are many more math departments in the United States doing good research now than in 1950.

    (Also, let’s not get distracted by the Tao and Perelman crowd. Mathematicians who aren’t otherworldly but who are still bright and good — the Aaronsons — are also more likely to be found and educated.)

    Second, there is some confusion in this discussion between first and second derivatives, or between either and logarithmic derivatives. Let X(t) be the total mathematics produced by society up to time t. Then which quantity is bound to decrease because mathematics has gotten harder? Is it dX/dt, d^2X/dt^2, or (dX/dt)/X? If it’s only the last one, that isn’t necessarily so bad.

  9. Anonymous Says:

    Also, let’s not get distracted by the Tao and Perelman crowd. Mathematicians who aren’t otherworldly but who are still bright and good — the Aaronsons

    Ouch! 😉 Scott you’ve got to hurry up and get a Fields medal quick to show that you are in fact otherworldly! 🙂

  10. Scott Says:

    Oh! I thought Greg was talking about Jon Aaronson of Tel Aviv University.

  11. Greg Kuperberg Says:

    Yeah, get cracking Scott. You only have three or four more chances at either the Fields Medal or the Nevanlinna Prize.

    Since you do QC, you may (until you reach 40) still have a chance at a Grand Slam: the Fields, the Turing Award, and the Nobel Prize in Physics. All you have to do is design the first working quantum computer using a revolutionary new theorem of quantum fault tolerance.

    It may also help if you joined forces with Jon Aaronson and claimed to be the same person. Are there any other Aaronsons available?

  12. Scott Says:

    Well, there’s Scott Aaronson, M.D., a breast implant surgeon in Palm Springs (you can see some before-and-after photos on his website). But I don’t think he could help me win a Fields, except possibly by distracting the competition.

    And what does it matter? Being called “bright and good” by Greg Kuperberg is a far greater honor than the Fields, Nobel, and Nevanlinna combined.

  13. Greg Kuperberg Says:

    Woo, woo! I didn’t know that I held such authority.

    I regret to announce that I am still considering the files of Tao, Perelman, and all other contenders from that crowd. (Except for Peter Shor. He rocks!) Applications may be sent to…nah, don’t bother. If you’re good, I will know who you are.

  14. Anonymous Says:

    The ArXiv is a great source of inspiration for all of us, with two proofs that P=NP and one that P!=NP just in the last month and a half!

  15. Anonymous Says:

    Why the 16 from Luca? Since you had 10 reasons for … followed by 13 reasons for … — didn’t you know that Luca is studying 3-term arithmetic progressions these days?

    aravind

  16. elad verbin Says:

    The problem is more that the increase in life expectance of human beings doesn’t scale along with the hardness of mathematics.

    So, essentially you’re asking if mathematics research is parallelizable… Because population increases exponentially (though not for long), while life expectancy increases sub-linearly.

    Speaking of life expectancy and population size, does everyone know a sci-fi story named “Melancholy Elephants” by Spider Robinson? It is, in my eyes, one of the best sci-fi stories ever written (along with Orson Scott Card’s “Ender’s Game” and “The Worthing Saga”, for example). Read it here.

  17. Scott Says:

    Why the 16 from Luca? Since you had 10 reasons for … followed by 13 reasons for …

    10 is a sum of two positive squares, 13 is a sum of two positive squares, 16 is not. It completely breaks the pattern.

    didn’t you know that Luca is studying 3-term arithmetic progressions these days?

    I thought he was securing the cyberspace.
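    The sums-of-two-positive-squares pattern is easy to verify mechanically; here is a throwaway Python check (the helper name is mine):

```python
# Confirms the pattern: 10 = 1 + 9 and 13 = 4 + 9 are sums of two
# positive squares, while 16 is not (16 + 0 doesn't count).
def two_positive_squares(n):
    limit = int(n ** 0.5) + 1
    return any(a * a + b * b == n
               for a in range(1, limit)
               for b in range(a, limit))

print([n for n in (10, 13, 16) if two_positive_squares(n)])  # [10, 13]
```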

  18. John Sidles Says:

    Scott sez: … for better and worse, math has gotten harder. Much, much harder. And we haven’t gotten any smarter.

    On the evidence, Scott is absolutely right! For enjoyment, I reviewed all Physical Review articles from 1937-39, and determined that to do cutting-edge research in quantum measurement theory, John von Neumann had to read only one or two articles per month (although these were pretty considerable articles, by the likes of van Vleck, Bohr, Dirac, etc.).

    In contrast, to stay current nowadays with the quantum information literature, a person has to read 5-10 articles per day. So over three scientific generations, the task of staying current has become on the order of 100X harder. And there is every prospect that in the next three scientific generations, the flow of literature will increase by another 100X.

    So how should today’s professor answer an impertinent student (the student in the opening lecture of Young Frankenstein) who says:

    Isn’t it true, professor, that the agnotology of modern mathematics resembles the agnotology of modern biology? Because, just as biologists must know the genetic code GCAT, mathematicians must know the integers. And just as biologists must understand the regulation of gene expression, mathematicians must know the real and complex numbers (but neither will ever master their subject).

    And just as individual biologists nowadays have no hope of understanding all genes—on purely informatic grounds; there being too many of them—isn’t it true that individual mathematicians have no hope of understanding all complexity classes, or topological classes, or any classes in general?

    And therefore, isn’t it true that mathematics is destined, by virtue of its complexity, to become an anthropic discipline, like anthropology, molecular biology, and string theory?

    IMHO, what mathematics will become in this century will be collectively determined by the answers embraced by precisely such impertinent students.

  19. GASARCH Says:

    Great quote on this matter from the movie
    OH GOD, II by GOD (played by George Burns)

    Math was a mistake, I made it too hard.

  20. HN Says:

    I personally don’t think that Galois theory is “easier” than the PCP theorem. How many years have passed since Gauss’s time, and people still get Fields Medals (partially) for results in linear algebra (the Horn conjecture, e.g.).

    Perhaps math seems so hard these days because the number of irrelevant questions is exponentially larger than in Euler’s era. (Scott, that’s an idea for your next post: the top 18 irrelevant problems in TCS that people keep publishing on!)

    If computation had been recognized as a fundamental concept in his time, Euler might have shown that “Primes is in P” (with all due respect to AKS).

  21. Scott Says:

    I personally don’t think that Galois theory is “easier” than the PCP theorem.

    Galois is about the exact point where things start to get hard…

  22. Scott Says:

    In contrast, to stay current nowadays with the quantum information literature, a person has to read 5-10 articles per day.

    Uh-oh!

    (Does it count if I scan 5-10 paper titles per day?)

  23. Michi Says:

    You know, reading Euler is lovely – but reading him in the original language is absolutely marvelous. When looking around for a B.Sc. thesis subject, one of my professors encouraged me to go sit down with John Wallis and the good old Euler in the original Latin and see what I could do with it. Wallis was horrible and unreadable. Euler was almost as easily read as the Vulgate Bible – i.e., keeping up a reading speed not quite comparable to my native language, but almost.

    I’m certain the translations are good – but if you have the Latin to support it, go read it in the original!

  24. Anonymous Says:

    I personally don’t think that Galois theory is “easier” than the PCP theorem.

    Galois theory at the level that Galois developed it is fantastically clever and beautiful, but not in the least bit “hard” in the sense that technically it is significantly less demanding than, say, learning enough analysis to do Riemann integration.


    Galois is about the exact point where things start to get hard…

    I think Gauss would beg to differ (and for that matter, so would I).

    p.s. I am not Gauss.

  25. Scott Says:

    I’m certain the translations are good – but if you have the Latin to support it, go read it in the original!

    Thatus iis est ideae greatus. Latina Aaronsonus summa cum laude. E pluribus unum!

  26. Anonymous Says:

    If computation had been recognized as a fundamental concept in his time, Euler might have shown that “Primes is in P” (with all due respect to AKS).

    Huh? Maybe he did show that “Primes is in P”. IIRC, he was the most prolific mathematician who ever lived, and they were still publishing his works seventy years after he died (or so my diff eq textbook says, which could be wrong). So maybe there is a “Primes is in P” proof in one of the many volumes of his work and nobody knows, because nobody has looked.

  27. Simone Severini Says:

    “Thatus iis est ideae greatus. Latina Aaronsonus summa cum laude. E pluribus unum!”

    Aaronsonus Gigas: Large animal which populated Umbria in ancient times.

    “Aaronsonus edit solus ranas.” From “Naturalis Historia” by Pliny the Elder.

  28. Anonymous Says:

    “I personally don’t think that Galois theory is “easier” than the PCP theorem.”

    You are absolutely right. But it just shows that the PCP Theorem is not that complicated. If you compare Galois theory with current math done in fields like number theory, for instance, then Galois is evidently much easier technically.

  29. Alejandro Says:

    Just imagine, if you wrote your papers in Latin the author would be Caledonius Filius Aaronis!

  30. ObsessiveMathsFreak Says:

    Reading the old master brings home the truth: that, for better and worse, math has gotten harder. Much, much harder. And we haven’t gotten any smarter.

    Mathematics has not gotten harder. Mathematicians have simply become worse at expressing their own mathematics.

  31. Anonymous Says:

    If Galois theory was easier to come up with and develop in Galois’s time than Perelman’s recent work was in ours, then why didn’t someone other than Galois come up with it earlier (or indeed later, as Galois’s work languished in some obscure cellar, unread, for many years)?

    You could argue that progress in math is slowing down (which I don’t believe), but to argue that what mathematicians did back then was in any sense easier is almost tautologically false. They, like us, used their abilities to the extent they could. What they came up with was the best they could do. In that sense there was nothing easier about it than what we do now.

    Personally I think Galois’s work was much more difficult, in some sense, than Perelman’s. What Perelman did, someone else — Yau, Hamilton, some obscure dude — would have done eventually, and maybe it wouldn’t even have taken so long. In the case of Galois we have positive proof that nobody else came up with the same thing even many years afterwards.

  32. Greg Kuperberg Says:

    Galois’ time was different because the great system of meritocracy in higher education was much less developed then and partly just plain broken. No one else had the same ideas because so many other bright, would-be mathematicians never learned how to read, even in Western Europe. Or if they were lucky and eventually made it to a university, they may well have been ignored even more completely than Galois was.

    As for Perelman, according to Nasar and also rumors, he was already hot on the trail some seven years ago. He had already had some of the beautiful ideas which are crucial for everything that he did. But while he worked in seclusion, no one else found even the first of his ideas. No one else even got to square two. So you can make the same argument for Perelman as for Galois. If Perelman’s ideas waited only seven years instead of decades, as Galois’s did, it is because there are many more people working now in 3-manifold topology than there were then working on polynomial equations.

    Sure, eventually someone probably would have found what Perelman found. But it would probably not have been Hamilton himself. That is, Hamilton did many, many great things with this program, but he had been stuck for a while and life is finite. Some of Perelman’s analyzers have said that it might have taken a century. That could be an exaggeration, but certainly it might have taken decades.

  33. Greg Kuperberg Says:

    Also, I don’t think it’s right to mark Galois theory as the transition to when mathematics got “hard”. The first really challenging tract of mathematics is the Principia — that is one reason that people admire it. If you don’t believe me, try proving directly from the 1/r^2 force law that two-body orbits are closed ellipses. If you have learned/memorized the solution, then sure, you can regurgitate it, but otherwise it’s not so easy.

    If the question is when people as distinguished as Euler lost the privilege of publishing baby papers, the answer is that they never did. Even today, first-rate mathematicians write for the Monthly. That was just part of Euler’s style. It was also Cauchy’s style, sometimes. It’s a perfectly honorable way to write papers, provided that you write with discretion and taste. On the other hand, Gauss (who predated Galois) was almost as severe as Perelman, and that’s fine too.

    Again, if you think that things first got hard with Galois, try proving Gauss’ quadratic reciprocity theorem on your own.

  34. Anonymous Says:

    Greg,

    Interesting point (I’m talking about your first response now). In reply to there being more mathematicians with higher schooling now, I would say that there are also more directions to go in, more open problems, more ripe research fields. People like Galois (or even more extreme, someone like Euclid) were basically in the dark, having to invent math out of thin air.

    I can buy the “technical hardness” bit, assuming it can be defined with some reasonable precision, but that’s far from the only relevant parameter. Take something like cryptography. The Diffie–Hellman scheme could have been invented, more or less, at any time over a period of several hundred years (probably longer). There’s nothing technically hard about it, and cryptography has been interesting to the elites of societies for a long time. Yet it wasn’t invented. Why? I don’t know, but surely there is some additional hardness constraint here — the idea itself — apart from the technical bit. It is in that sense I feel (without, admittedly, understanding a damn thing about Perelman’s proof) that what Galois did might in some sense have been harder than what Perelman has done.

    But I can see the other side too.
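    For what it’s worth, the Diffie–Hellman exchange really is as technically simple as claimed above. Here is a toy sketch in Python, with tiny, insecure parameters chosen purely for illustration (real deployments use primes of 2048 bits or more, and random secret exponents):

```python
# Toy Diffie-Hellman key exchange: both parties end up with the same
# shared secret g^(ab) mod p, having exchanged only g^a and g^b.
p = 2027      # small public prime (insecure; for illustration only)
g = 2         # public base

a = 1579      # Alice's secret exponent (would be random in practice)
b = 733       # Bob's secret exponent

A = pow(g, a, p)   # Alice sends A to Bob
B = pow(g, b, p)   # Bob sends B to Alice

shared_alice = pow(B, a, p)   # Alice computes B^a mod p
shared_bob = pow(A, b, p)     # Bob computes A^b mod p
assert shared_alice == shared_bob  # both hold g^(ab) mod p
```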