When modular arithmetic was a STOC result
So, it seems the arXiv is now so popular that even Leonhard Euler has contributed 25 papers, despite being dead since 1783. (Thanks to Ars Mathematica for this important news item, as well as for the hours of procrastination on my part that led to its rediscovery.) Since I’d long been curious about the mathematical research interests of the nonliving, I decided to check out Leonhard’s most recent preprint, math.HO/0608467 (“Theorems on residues obtained by the division of powers”). The paper starts out slow: explaining in detail why, if a mod p is nonzero, then a^2 mod p, a^3 mod p, and so on are also nonzero. By the end, though, it’s worked out most of the basics of modular arithmetic, enough (for example) to analyze RSA. Furthermore, the exposition, while “retro” in style, is sufficiently elegant that I might even recommend acceptance at a minor theory conference, even though the basic results have of course been known for like 200 years.
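For readers who want to see the paper’s punchline in action: here is a minimal Python sketch (mine, not Euler’s) checking both the opening observation about nonzero residues of powers and the totient theorem that does the heavy lifting in RSA.

```python
from math import gcd

def euler_phi(n):
    """Euler's totient: count of 1 <= k <= n with gcd(k, n) == 1."""
    return sum(1 for k in range(1, n + 1) if gcd(k, n) == 1)

# The paper's opening observation: for prime p and a % p != 0,
# every power a**k % p is also nonzero.
p = 13
for a in range(1, p):
    assert all(pow(a, k, p) != 0 for k in range(1, 2 * p))

# Euler's theorem: if gcd(a, n) == 1, then a**phi(n) % n == 1.
# This identity is the engine behind RSA decryption.
for n in range(2, 50):
    for a in range(2, n):
        if gcd(a, n) == 1:
            assert pow(a, euler_phi(n), n) == 1
```

The brute-force totient is of course nothing a referee would accept at STOC, but it makes the 200-year-old result executable.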
Oh — you say that Mr. E’s papers were as difficult and abstract for their time as Wiles and Perelman’s papers are for our own time? BULLSHIT. Reading the old master brings home the truth: that, for better and worse, math has gotten harder. Much, much harder. And we haven’t gotten any smarter.
Comment #1 September 10th, 2006 at 5:54 am
I was expecting to see 16 reasons why I should believe that math has gotten harder.
Comment #2 September 10th, 2006 at 6:17 am
Where do you get 16? Obviously the number of reasons has to be a sum of two positive squares.
(Unless the number’s zero, which makes for the best blog posts of all.)
Comment #3 September 10th, 2006 at 10:34 am
We haven’t gotten smarter, but we now have the shoulders of giants to ride on, which Euler didn’t have the benefit of riding on (or something).
Comment #4 September 10th, 2006 at 11:17 am
Math has gotten harder. Much, much harder. And we haven’t gotten any smarter.
Perhaps, but there are a lot more of us nowadays.
Still, you raise an interesting point: if we accept as given that (1) math research is constantly getting harder, and (2) human population can’t scale along with that hardness, then the rate of progress in mathematical research must eventually slow to a near halt.
Maybe computers can help.
Comment #5 September 10th, 2006 at 11:49 am
Scott, Scott…you are missing an important point. By choosing to publish only in ARXIV, Mr. Euler and Mr. Perelman have given the finger to silly conferences like STOC and silly refereed journals. They have gone directly to the heart of the matter, ARXIV: a path that achieves their goal perfectly with the greatest economy of resources; the purest, simplest, sufficient path. Great minds abhor the superfluous pomposity of STOC.
Comment #6 September 10th, 2006 at 12:30 pm
You’ve got a point. Come to think of it, g^(r+s)=g^r g^s (mod p) is one of the less trivial results to appear in an arXiv preprint…
Comment #7 September 10th, 2006 at 12:33 pm
… then the rate of progress in mathematical research must eventually slow to a near halt.
The problem is more that the increase in life expectancy of human beings doesn’t scale along with the hardness of mathematics.
Comment #8 September 10th, 2006 at 2:32 pm
Scott is surely correct that math has gotten a lot harder. It would be depressing if it hadn’t — it would mean that society wasn’t accumulating mathematical ideas and moving on. Indeed, it was extremely depressing when mathematics did get easier in certain historical periods, for example when Greek mathematics was forgotten in the Middle Ages in Europe.
But it is easy to take Scott’s observation too far, as some people here have done. Any individual mathematics paper is only as hard as society wants it to be. The real question is not that, but how well mathematicians can cooperate to produce more mathematics.
Two remarks on that. First, human society is still extremely far away from full cooperation in the production of mathematics. Things have gotten dramatically better in the past 50 years, but I imagine that there is still at least a factor of 100 or 1000 left to go, even without counting the use of computers to produce mathematics. A Tao or a Perelman is much more likely to be found and properly educated now than in 1950; and once educated, he or she can communicate and visit with other mathematicians much more easily. There are many more math departments in the United States doing good research now than in 1950.
(Also, let’s not get distracted by the Tao and Perelman crowd. Mathematicians who aren’t otherworldly but who are still bright and good — the Aaronsons — are also more likely to be found and educated.)
Second, there is some confusion in this discussion between first and second derivatives, or between either and logarithmic derivatives. Let X(t) be the total mathematics produced by society up to time t. Then which quantity is bound to decrease because mathematics has gotten harder? Is it dX/dt, d^2X/dt^2, or (dX/dt)/X? If it’s only that last one, that isn’t necessarily so bad.
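To make the distinction concrete, here is one hypothetical growth law (my example, not the commenter’s) under which the logarithmic derivative declines even though absolute output keeps accelerating:

```latex
X(t) = e^{\sqrt{t}}, \qquad
\frac{dX}{dt} = \frac{e^{\sqrt{t}}}{2\sqrt{t}} \to \infty, \qquad
\frac{1}{X}\,\frac{dX}{dt} = \frac{1}{2\sqrt{t}} \to 0 .
```

So a community whose *relative* growth rate slows to zero can still be producing ever more mathematics per year in absolute terms.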
Comment #9 September 11th, 2006 at 12:38 am
Also, let’s not get distracted by the Tao and Perelman crowd. Mathematicians who aren’t otherworldly but who are still bright and good — the Aaronsons
Ouch! 😉 Scott you’ve got to hurry up and get a Fields medal quick to show that you are in fact otherworldly! 🙂
Comment #10 September 11th, 2006 at 2:01 am
Oh! I thought Greg was talking about Jon Aaronson of Tel Aviv University.
Comment #11 September 11th, 2006 at 2:12 am
Yeah, get cracking Scott. You only have three or four more chances at either the Fields Medal or the Nevanlinna Prize.
Since you do QC, you may (until you reach 40) still have a chance at a Grand Slam: the Fields, the Turing Award, and the Nobel Prize in Physics. All you have to do is design the first working quantum computer using a revolutionary new theorem of quantum fault tolerance.
It may also help if you joined forces with Jon Aaronson and claimed to be the same person. Are there any other Aaronsons available?
Comment #12 September 11th, 2006 at 2:34 am
Well, there’s Scott Aaronson, M.D., a breast implant surgeon in Palm Springs (you can see some before-and-after photos on his website). But I don’t think he could help me win a Fields, except possibly by distracting the competition.
And what does it matter? Being called “bright and good” by Greg Kuperberg is a far greater honor than the Fields, Nobel, and Nevanlinna combined.
Comment #13 September 11th, 2006 at 2:56 am
Woo, woo! I didn’t know that I held such authority.
I regret to announce that I am still considering the files of Tao, Perelman, and all other contenders from that crowd. (Except for Peter Shor. He rocks!) Applications may be sent to…nah, don’t bother. If you’re good, I will know who you are.
Comment #14 September 11th, 2006 at 5:14 am
The ArXiv is a great source of inspiration for all of us, with two proofs that P=NP and one that P!=NP just in the last month and a half!
Comment #15 September 11th, 2006 at 11:12 am
Why the 16 from Luca? Since you had 10 reasons for … followed by 13 reasons for … — didn’t you know that Luca is studying 3-term arithmetic progressions these days?
aravind
Comment #16 September 11th, 2006 at 1:33 pm
The problem is more that the increase in life expectancy of human beings doesn’t scale along with the hardness of mathematics.
So, essentially you’re asking if mathematics research is parallelizable… because population increases exponentially (though not for long), while life expectancy increases sub-linearly.
Speaking of life expectancy and population size, does everyone know the sci-fi story “Melancholy Elephants” by Spider Robinson? It is, in my eyes, one of the best sci-fi stories ever written (along with Orson Scott Card’s “Ender’s Game” and “The Worthing Saga”, for example). Read it here.
Comment #17 September 11th, 2006 at 1:50 pm
Why the 16 from Luca? Since you had 10 reasons for … followed by 13 reasons for …
10 is a sum of two positive squares, 13 is a sum of two positive squares, 16 is not. It completely breaks the pattern.
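For the skeptical, a quick brute-force check of the claim (my sketch, nothing more):

```python
def is_sum_of_two_positive_squares(n):
    """True if n = a**2 + b**2 for integers a, b >= 1."""
    return any((n - a * a) > 0 and round((n - a * a) ** 0.5) ** 2 == n - a * a
               for a in range(1, int(n ** 0.5) + 1))

# 10 = 1 + 9 and 13 = 4 + 9 qualify; 16 does not (16 + 0 needs b = 0).
print([n for n in range(1, 20) if is_sum_of_two_positive_squares(n)])
# → [2, 5, 8, 10, 13, 17, 18]
```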
didn’t you know that Luca is studying 3-term arithmetic progressions these days?
I thought he was securing the cyberspace.
Comment #18 September 11th, 2006 at 2:13 pm
Scott sez: … for better and worse, math has gotten harder. Much, much harder. And we haven’t gotten any smarter.
On the evidence, Scott is absolutely right! For enjoyment, I reviewed all Physical Review articles from 1937-39, and determined that to do cutting-edge research in quantum measurement theory, John von Neumann had to read only one or two articles per month (although these were pretty considerable articles, by the likes of van Vleck, Bohr, Dirac, etc.).
In contrast, to stay current nowadays with the quantum information literature, a person has to read 5-10 articles per day. So over three scientific generations, the task of staying current has become on the order of 100X harder. And there is every prospect that in the next three scientific generations, the flow of literature will increase by another 100X.
So how should today’s professor answer an impertinent student (the student in the opening lecture of Young Frankenstein) who says:
Isn’t it true, professor, that the agnotology of modern mathematics resembles the agnotology of modern biology? Because, just as biologists must know the genetic code GCAT, mathematicians must know the integers. And just as biologists must understand the regulation of gene expression, mathematicians must know the real and complex numbers (but neither will ever master their subject).
And just as individual biologists nowadays have no hope of understanding all genes—on purely informatic grounds; there being too many of them—isn’t it true that individual mathematicians have no hope of understanding all complexity classes, or topological classes, or any classes in general?
And therefore, isn’t it true that mathematics is destined, by virtue of its complexity, to become an anthropic discipline, like anthropology, molecular biology, and string theory?
IMHO, what mathematics will become in this century will be collectively determined by the answers embraced by precisely such impertinent students.
Comment #19 September 11th, 2006 at 3:18 pm
Great quote on this matter from the movie
OH GOD, II by GOD (played by George Burns)
Math was a mistake, I made it too hard.
Comment #20 September 11th, 2006 at 4:55 pm
I personally don’t think that Galois theory is “easier” than the PCP theorem. So many years have passed since Gauss’s time, and people still get Fields Medals (partially) for results in linear algebra (the Horn conjecture, for example).
Perhaps math seems so hard these days because the number of irrelevant questions is exponentially larger than in Euler’s era. (Scott, there’s an idea for your next post: the top 18 irrelevant problems in TCS that people keep publishing on!)
If computation had been recognized as a fundamental concept in his time, Euler might have shown that “Primes is in P” (with all due respect to AKS).
Comment #21 September 11th, 2006 at 5:21 pm
I personally don’t think that Galois theory is “easier” than the PCP theorem.
Galois is about the exact point where things start to get hard…
Comment #22 September 11th, 2006 at 5:23 pm
In contrast, to stay current nowadays with the quantum information literature, a person has to read 5-10 articles per day.
Uh-oh!
(Does it count if I scan 5-10 paper titles per day?)
Comment #23 September 11th, 2006 at 5:35 pm
You know, reading Euler is lovely – but reading him in the original language is absolutely marvelous. When looking around for a B.Sc. thesis subject, one of my professors encouraged me to go sit down with John Wallis and the good old Euler in the original Latin and see what I could do with it. Wallis was horrible and unreadable. Euler was almost as easy to read as the Vulgate Bible – viz., keeping up a reading speed not quite comparable to my native language, but almost.
I’m certain the translations are good – but if you have the Latin to support it, go read it in the original!
Comment #24 September 11th, 2006 at 5:54 pm
I personally don’t think that Galois theory is “easier” than the PCP theorem.
Galois theory at the level that Galois developed it is fantastically clever and beautiful, but not in the least bit “hard” in the sense that technically it is significantly less demanding than, say, learning enough analysis to do Riemann integration.
Galois is about the exact point where things start to get hard…
I think Gauss would beg to differ (and for that matter, so would I).
p.s. I am not Gauss.
Comment #25 September 11th, 2006 at 6:17 pm
I’m certain the translations are good – but if you have the Latin to support it, go read it in the original!
Thatus iis est ideae greatus. Latina Aaronsonus summa cum laude. E pluribus unum!
Comment #26 September 11th, 2006 at 6:29 pm
If computation had been recognized as a fundamental concept in his time, Euler might have shown that “Primes is in P” (with all due respect to AKS).
Huh? Maybe he did show that “Primes is in P”. IIRC, he was the most prolific mathematician who ever lived, and they were still publishing his works seventy years after he died (or so my diff eq textbook says, which could be wrong). So maybe there is a “Primes is in P” proof in one of the many volumes of his work, and nobody knows because nobody has looked.
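Euler of course had nothing like AKS, but his own generalization of Fermat’s little theorem already yields a primality test — the Fermat test. A sketch of that much weaker relative (randomized, and famously fooled by Carmichael numbers, so emphatically not “Primes is in P”):

```python
import random

def fermat_test(n, rounds=20):
    """Probabilistic primality test via Fermat's little theorem:
    if n is prime and gcd(a, n) == 1, then a**(n-1) % n == 1.
    Composite Carmichael numbers can fool it, so this is only a heuristic."""
    if n < 4:
        return n in (2, 3)
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:
            return False  # a is a witness: n is definitely composite
    return True  # probably prime
```

Deciding primality *deterministically* in polynomial time had to wait for AKS in 2002, which is rather the point of the joke.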
Comment #27 September 11th, 2006 at 6:53 pm
“Thatus iis est ideae greatus. Latina Aaronsonus summa cum laude. E pluribus unum!”
Aaronsonus Gigas: Large animal which populated Umbria in ancient times.
“Aaronsonus edit solus ranas.” From “Naturalis Historia” by Pliny the Elder.
Comment #28 September 12th, 2006 at 3:14 am
“I personally don’t think that Galois theory is “easier” than the PCP theorem.”
You are absolutely right. But it just shows that the PCP theorem is not that complicated. If you compare Galois theory with current math done in fields like number theory, for instance, then Galois is evidently much easier technically.
Comment #29 September 12th, 2006 at 1:28 pm
Just imagine, if you wrote your papers in Latin, the author would be Caledonius Filius Aaronis!
Comment #30 September 12th, 2006 at 1:33 pm
Reading the old master brings home the truth: that, for better and worse, math has gotten harder. Much, much harder. And we haven’t gotten any smarter.
Mathematics has not gotten harder. Mathematicians have simply become worse at expressing their own mathematics.
Comment #31 September 13th, 2006 at 11:03 am
If Galois theory was easier to come up with and develop in Galois’s time than Perelman’s recent work was in ours, then why didn’t someone other than Galois come up with it earlier (or indeed later, as Galois’s work languished in some obscure cellar, unread, for many years)?
You could argue that progress in math is slowing down (which I don’t believe), but to argue that what mathematicians did back then was in any sense easier is almost tautologically false. They, like we, used their abilities to the extent they could. What they came up with was the best they could do. In that sense there was nothing easier about it than what we do now.
Personally I think Galois’ work was much more difficult, in some sense, than Perelman’s. What Perelman did, someone else — Yau, Hamilton, some obscure dude — would have done eventually, and maybe it wouldn’t even have taken so long. In the case of Galois we have positive proof that nobody else did come up with the same thing even many years afterwards.
Comment #32 September 13th, 2006 at 12:39 pm
Galois’ time was different because the great system of meritocracy in higher education was much less developed then and partly just plain broken. No one else had the same ideas because so many other bright, would-be mathematicians never learned how to read, even in Western Europe. Or if they were lucky and eventually made it to a university, they may well have been ignored even more completely than Galois was.
As for Perelman, according to Nasar and also rumors, he was already hot on the trail some seven years ago. He had already had some of the beautiful ideas that are crucial for everything he did. But while he worked in seclusion, no one else found even the first of his ideas. No one else even got to square two. So you can make the same argument for Perelman as for Galois. If Perelman’s ideas waited only seven years instead of decades, as with Galois’s, it is because there are many more people working in 3-manifold topology now than there were working on polynomial equations then.
Sure, eventually someone probably would have found what Perelman found. But it would probably not have been Hamilton himself. That is, Hamilton did many, many great things with this program, but he had been stuck for a while, and life is finite. Some of those who have analyzed Perelman’s work have said that it might have taken a century. That could be an exaggeration, but certainly it might have taken decades.
Comment #33 September 13th, 2006 at 12:53 pm
Also, I don’t think it’s right to mark Galois theory as the transition to when mathematics got “hard”. The first really challenging tract of mathematics is the Principia — that is one reason that people admire it. If you don’t believe me, try proving directly from the 1/r^2 force law that two-body orbits are closed ellipses. If you have learned/memorized the solution, then sure, you can regurgitate it, but otherwise it’s not so easy.
If the question is when people as distinguished as Euler lost the privilege of publishing baby papers, the answer is that they never did. Even today, first-rate mathematicians write for the Monthly. That was just part of Euler’s style. It was also Cauchy’s style, sometimes. It’s a perfectly honorable way to write papers, provided that you write with discretion and taste. On the other hand, Gauss (who predated Galois) was almost as severe as Perelman, and that’s fine too.
Again, if you think that things first got hard with Galois, try proving Gauss’ quadratic reciprocity theorem on your own.
Comment #34 September 13th, 2006 at 2:55 pm
Greg,
Interesting point (I’m talking about your first response now). In reply to there being more mathematicians with higher schooling now, I would say that there are also more directions to go in, more open problems, more ripe research fields. People like Galois (or even more extreme, someone like Euclid) were basically in the dark, having to invent math out of thin air.
I can buy the “technical hardness” bit, assuming it can be defined with some reasonable precision, but that’s far from the only relevant parameter. Take something like cryptography. The Diffie-Hellman scheme could have been invented, more or less, at any time over a period of several hundred years (probably longer). There’s nothing technically hard about it, and cryptography has been interesting to the elites of societies for a long time. Yet it wasn’t invented. Why? I don’t know, but surely there is some additional hardness constraint here — the idea itself — apart from the technical bit. It is in that sense that I feel (without, admittedly, understanding a damn thing about Perelman’s proof) that what Galois did might in some sense have been harder than what Perelman has done.
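The “nothing technically hard about it” point is easy to make concrete: the whole of Diffie-Hellman key exchange is modular exponentiation, which Euler could have done by hand. A toy sketch (my parameters, purely illustrative — real use needs large safe primes):

```python
import random

# Toy Diffie-Hellman over a small prime. p = 2**31 - 1 is a Mersenne
# prime; 16807 = 7**5 is a classic generator for it. NOT secure sizes.
p, g = 2_147_483_647, 16_807

a = random.randrange(2, p - 1)   # Alice's secret exponent
b = random.randrange(2, p - 1)   # Bob's secret exponent

A = pow(g, a, p)                 # Alice publishes g^a mod p
B = pow(g, b, p)                 # Bob publishes g^b mod p

# Both sides derive the same shared key: (g^b)^a = (g^a)^b (mod p)
assert pow(B, a, p) == pow(A, b, p)
```

Everything here was within eighteenth-century reach except the one idea: that exponentiation mod p is easy while its inverse appears hard.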
But I can see the other side too.