How is it possible that mistakes occur in mathematics?
René Descartes's Method was so clear, he said, that a mistake could only happen by inadvertence. Yet ... his Géométrie contains conceptual mistakes about three-dimensional space.
Henri Poincaré said it was strange that mistakes happen in mathematics, since mathematics is just sound reasoning, such as anyone in his right mind follows. His explanation was memory lapse—there are only so many things we can keep in mind at once.
Wittgenstein said that mathematics could be characterized as the subject where it's possible to make mistakes. (Actually, it's not just possible, it's inevitable.) The very notion of a mistake presupposes that there is right and wrong independent of what we think, which is what makes mathematics mathematics. We mathematicians make mistakes, even important ones, even in famous papers that have been around for years.
Philip J. Davis displays an imposing collection of errors, with some famous names. His article shows that mistakes aren't uncommon. It shows that mathematical knowledge is fallible, like other knowledge.
Some mistakes come from keeping old assumptions in a new context.
Infinite-dimensional space is just like finite-dimensional space—except for one or two properties, which are entirely different.
Riemann stated and used what he called "Dirichlet's principle" incorrectly [when trying to prove his mapping theorem].
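In modern notation (a reconstruction, not Riemann's own formulation), the principle asserts that among all sufficiently smooth functions $u$ on a domain $\Omega$ agreeing with prescribed values $f$ on the boundary, there is one minimizing the Dirichlet energy, and that this minimizer is harmonic:

```latex
% Dirichlet's principle, in modern notation (a reconstruction):
% among admissible u equal to f on the boundary of Omega, minimize the energy
\min_{\substack{u \in C^{1}(\overline{\Omega}) \\ u|_{\partial\Omega} = f}}
  \int_{\Omega} |\nabla u|^{2}\, dx,
% and the minimizer (whose existence Riemann took for granted) is harmonic:
\qquad \Delta u = 0 \ \text{in } \Omega.
```

Riemann's error was to take the existence of the minimizer for granted; Weierstrass later exhibited variational problems in which the infimum is not attained by any admissible function.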
Julius König and David Hilbert each thought he had proven the continuum hypothesis. (Decades later, Kurt Gödel and Paul Cohen showed it is undecidable: independent of the standard axioms of set theory.)
Sometimes mathematicians try to give a complete classification of an object of interest. It's a mistake to claim a complete classification while leaving out several cases. That's what happened, first to Descartes, then to Newton, in their attempts to classify cubic curves (Boyer). [cf. this annotation by Peter Shor.]
Is a gap in a proof a mistake? Newton found the speed of a falling stone by dividing 0/0. Berkeley called him to account for bad algebra, but admitted Newton had the right answer... Mistake or not?
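Newton's computation can be reconstructed in modern notation (the symbols below are my choices, not Newton's): for a stone that has fallen a distance $s(t) = \tfrac{1}{2}gt^{2}$, form the increment over a small quantity $o$, divide by $o$, and then set $o = 0$:

```latex
\frac{s(t+o) - s(t)}{o}
  \;=\; \frac{\tfrac{1}{2}g(t+o)^{2} - \tfrac{1}{2}g t^{2}}{o}
  \;=\; gt + \tfrac{1}{2}go
  \;\longrightarrow\; gt \quad \text{on setting } o = 0.
```

Berkeley's complaint was exactly this: to divide by $o$ one must assume $o \neq 0$, yet the final step treats $o$ as exactly $0$. The answer $gt$ is right, but by the algebra of the day the derivation amounts to dividing 0 by 0.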
"The mistakes of a great mathematician are worth more than the correctness of a mediocrity." I've heard those words more than once. Explicating this thought would tell us something about the nature of mathematics. For most academic philosophers of mathematics, this remark has nothing to do with mathematics or the philosophy of mathematics. Mathematics, for them, is indubitable: rigorous deductions from premises. If you made a mistake, your deduction wasn't rigorous. By definition, then, it wasn't mathematics!
So the brilliant, fruitful mistakes of Newton, Euler, and Riemann, weren't mathematics, and needn't be considered by the philosopher of mathematics.
Riemann's incorrect statement of Dirichlet's principle was corrected, implemented, and flowered into the calculus of variations. On the other hand, thousands of correct theorems are published every week. Most lead nowhere.
A famous oversight of Euclid and his students (don't call it a mistake) was neglecting the relation of "between-ness" of points on a line. This relation was used implicitly by Euclid in 300 B.C. It was recognized explicitly by Moritz Pasch over 2,000 years later, in 1882. For two millennia, mathematicians and philosophers accepted reasoning that they later rejected.
Can we be sure that we, unlike our predecessors, are not overlooking big gaps? We can't. Our mathematics can't be certain.
The reference to the aforementioned article by Philip J. Davis is:

Philip J. Davis, "Fidelity in Mathematical Discourse: Is One and One Really Two?", The American Mathematical Monthly, Vol. 79, No. 3 (1972), pp. 252–263.
From that article, I find particularly amusing the following couple of paragraphs from page 262:
There is a book entitled Erreurs de Mathématiciens, published by Maurice Lecat in 1935 in Brussels. This book contains more than 130 pages of errors committed by mathematicians of the first and second rank from antiquity to about 1900. There are parallel columns listing the mathematician, the place where his error occurs, the man who discovers the error, and the place where the error is discussed. For example, J. J. Sylvester committed an error in "On the Relation between the Minor Determinant of Linearly Equivalent Quadratic Factors", Philos. Mag., (1851) pp. 295-305. This error was corrected by H. F. Baker in the Collected Papers of Sylvester, Vol. I, pp. 647-650.
A mathematical error of international significance may occur every twenty years or so. By this I mean the conjunction of a mathematician of great reputation and a problem of great notoriety. Such a conjunction occurred around 1945, when H. Rademacher thought he had solved the Riemann Hypothesis. There was a report in Time magazine.