Monday, October 26, 2009
Paraprosdokian
"I want to die like my father, quietly, in his sleep—not screaming and terrified like his passengers." Bob Monkhouse
Friday, September 11, 2009
Dase's algorithm
I just saw a TED talk about Mathemagics by Arthur Benjamin, and googled a few related things. With the notable exception of Alexander Aitken, it seems that very few calculator prodigies are able to explain how they actually calculate so quickly. The fuzziness of the explanations suggests some less-rational procedure that seems closer to intuition than to any algorithm.
One such calculator prodigy was Zacharias Dase (1824-1861). He is briefly mentioned in Hofstadter's book, Gödel, Escher, Bach. His calculation skills were such that Gauss himself was impressed and had him hired by the Hamburg Academy of Sciences to compile some mathematical tables. Dase had other amazing abilities, like counting the sheep in a flock in a single glance.
In the MacTutor History of Mathematics archive, you can read the following sentence about Dase: "He multiplied two 20 digit numbers in 6 minutes; two 40 digit numbers in 40 minutes; two 100 digit numbers in 8 hours 45 minutes."
When you put these figures on a logarithmic plot, it seems that Dase's unknown algorithm was slightly slower than quadratic (solid line). It is likely that he was using the same algorithm as any of us: multiplying each digit of the first number by the second number, and summing the results. He was very good at it, and he obviously had some tricks, but that was probably all there was to it.
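Out of curiosity, the scaling exponent can be estimated from the three reported times with a least-squares fit on a log-log scale. This is just a back-of-the-envelope sketch in Python, using only the three data points from the MacTutor quote:

```python
import math

# Dase's reported times (digits, minutes) from the MacTutor quote.
times = [(20, 6), (40, 40), (100, 8 * 60 + 45)]

# Least-squares slope of log(time) vs log(digits): the exponent p in
# time ~ digits**p. Schoolbook digit-by-digit multiplication, with each
# elementary step taking constant time, would give p = 2.
xs = [math.log(n) for n, _ in times]
ys = [math.log(t) for _, t in times]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(round(slope, 2))  # about 2.78
```

With only three points this is a rough estimate, of course, but it gives an idea of how Dase's times grew with the number of digits.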
Had Zacharias Dase lived today, he might have tried the FFT-like algorithm of Schönhage and Strassen (dotted line), which is faster for multiplying large numbers.
http://www.ted.com/talks/lang/eng/arthur_benjamin_does_mathemagic.html
Saturday, August 22, 2009
What is the difference between an ape and the number pi?
My message in French of December 19th had to do with tomorrow's newspaper being hidden somewhere in the number pi. Express pi in base 26 and write it down with our alphabet: you obtain an infinite series of random-looking letters, in which any given finite sequence of letters (including, e.g., Hamlet, or any text yet to be written) appears with probability 1, assuming pi is a normal number.
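As a back-of-the-envelope check of how deep one would have to dig, the expected gap between occurrences of a k-letter word in a random stream of base-26 digits is 26^k. A tiny Python sketch (the test words are just illustrative choices):

```python
# Expected gap (in letters) between occurrences of a given k-letter word
# in a stream of independent, uniformly distributed base-26 digits -- the
# statistical model behind the "Hamlet is hidden in pi" claim.
def expected_gap(word, base=26):
    return base ** len(word)

print(expected_gap("PI"))      # 676
print(expected_gap("HAMLET"))  # 308915776, roughly 3e8 letters on average
```

Hamlet itself, at well over a hundred thousand letters, would only be expected absurdly far into the expansion; the point of the paradox is that it is nonetheless there.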
A related paradox is that of an ape rewriting Hamlet by typing randomly on a keyboard. The difference, though, is that it would take the ape longer than the age of the universe to finish scene 1. In pi, the text seems to be there from the beginning. Just say "the ratio of a circle's perimeter to its diameter", and here is Hamlet!
The ape's paradox was devised by Borel, who solved it in a quick-and-dirty way by arguing that very small probabilities ought to be treated as impossibilities, and very large probabilities as certainties [1]. Such a pragmatic approach is so surprising from the pen of a mathematician that I cannot but read it as a sign of real frustration on Borel's part. Apparently, he stopped working on infinity in the twenties [2].
The difference between the ape and the number pi has to do with the difference between actual and potential infinity. The ape may potentially write Hamlet if he keeps typing long enough, while Hamlet is actually in pi. The possibility of actual infinities drove mathematicians nuts at the end of the nineteenth and the beginning of the twentieth century. Quite literally for a few of them [2] who, unlike Borel, kept working on these problems.
Another reading of mine sheds a different light on that issue [3]: actuality/potentiality is a dichotomy, just like empiricism/speculation, creation/evolution, etc. We love opposing things because it helps us understand; a dichotomy is a simplified model that puts order in our mind, not in our world. We are constantly shrinking the complexity of the world to make it fit in the much smaller space of what we can understand. We most often shrink a hyper-dimensional reality to a 1-dimensional segment, and eventually we ignore the segment and consider only its two ends. Potentiality and actuality are two such ends, and so, as a matter of fact, are finite and infinite.
Choosing one of the two options in a dichotomy cannot be right or wrong. Dichotomies are fruitful because they pave the way towards a better understanding of our fundamentally non-dichotomous world. When a dichotomy is no longer taken too seriously, this generally means that the phenomenon it describes is becoming better understood: solid/liquid, rational/intuitive, good/bad, wave/particle, etc. Other dichotomies are still quite lively, but they will no doubt end up the same way. I would love to be there when this happens for deterministic/random, past/future, matter/spirit and dead/alive.
[1] E. Borel, Probabilité et Certitude, Presses Universitaires de France (1950).
[2] L. Graham & J.-M. Kantor, Naming Infinity: A True Story of Religious Mysticism and Mathematical Creativity, Harvard University Press (2009).
[3] S.J. Gould, Time's Arrow, Time's Cycle: Myth and Metaphor in the Discovery of Geological Time, Harvard University Press (1987).
Friday, August 21, 2009
Why does the other lane seem to be faster?
The answer is: “because the other lane actually is faster”. And you can jump from one lane to the other without changing anything about it. Whatever you do, you will still spend most of the time in the slower lane.
It is the high density of a lane that makes it slow, because every driver adjusts their speed to the distance to the car in front. For the same reason, traffic is fastest where it is least dense, because that is where the distance to the car in front is largest. If you take a snapshot of a traffic jam from a helicopter and count the cars, you will necessarily find more slow cars than fast cars.
Suppose the slow lane contains 2/3 of the cars and the fast lane 1/3. Every driver is alternately in a fast and in a slow lane, especially if he keeps jumping from one to the other. Since the fractions of slow and fast cars are 2/3 and 1/3, a driver spends 2/3 of the time in a slow lane and only 1/3 in a fast lane.
So when you look at the other lane, two times out of three it will be faster than yours.
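That snapshot argument is easy to check numerically. The minimal Monte Carlo sketch below, with an assumed 2000/1000 split between the lanes, picks a random car from the jam many times over, as if choosing which car "you" are, and measures how often it sits in the slow lane:

```python
import random

# Toy snapshot of a jam: the slow lane holds twice as many cars as the
# fast lane. A randomly chosen driver is therefore in the slow lane with
# probability 2/3, regardless of any lane-hopping strategy.
def fraction_in_slow_lane(n_slow, n_fast, trials=100_000, seed=0):
    rng = random.Random(seed)
    cars = ["slow"] * n_slow + ["fast"] * n_fast
    hits = sum(rng.choice(cars) == "slow" for _ in range(trials))
    return hits / trials

print(fraction_in_slow_lane(2000, 1000))  # close to 2/3
```

The simulation is deliberately naive; the point is only that sampling a driver uniformly from the snapshot reproduces the 2/3 figure from the argument above.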
Friday, December 19, 2008
Pi = C,CVEZCVBMLYZGEC...
There is a property of the number pi that makes me doubt its existence, or more modestly, that makes me realize I had not really understood what a number is. That property is: "pi is a normal number." It means that the sequence of digits of pi, which begins 314159, has all the properties of an infinite random sequence, in which each digit appears with the same frequency, 1/10. (Strictly speaking, the normality of pi is a conjecture: strongly supported numerically, but still unproven.)
One consequence of this property is that any digit sequence of finite length, such as 0123456789, occurs somewhere in the digits of pi. The longer the sequence, the lower its frequency; but since pi is an infinite sequence, we are certain that any finite sequence occurs in it infinitely many times. For example, the 10-digit sequence 0123456789 occurs on average once every 10000000000 digits, just like 0000000000, or 3141592653.
The same observation is more shocking when phrased differently. When we write pi = 3.1415, this is just a concise way of saying that pi is the number obtained from the computation 3*1 + 1*1/10 + 4*1/(10*10) + 1*1/(10*10*10) + 5*1/(10*10*10*10). This is what we call base 10. With the base-2 numeration of computers, we would write pi = 11.0010010..., which means pi = 1*2 + 1 + 0*1/2 + 0*1/(2*2) + 1*1/(2*2*2) + ... If we wanted to use a base larger than 10, we would need symbols for all the digits up to the base. To write numbers in base 27, for example, we can agree to use the letters of our alphabet plus the space: 0 = "_", 1 = "A", 2 = "B", and so on up to 26 = "Z". In base 27, with this convention, the first digits of pi are the ones given in the title of this post.
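The conversion just described is easy to sketch in Python. The value of pi below is hard-coded to 35 digits, which is plenty for a short demo; the first base-27 digits it produces reproduce the ones in the title:

```python
from fractions import Fraction

# Fractional part of pi, truncated to 35 decimal digits: exact rational
# arithmetic keeps the base conversion free of rounding errors.
PI_FRAC = Fraction(14159265358979323846264338327950288, 10**35)

def frac_to_base27(x, n):
    """First n base-27 digits of the fractional part x,
    written with 0 -> '_', 1 -> 'A', ..., 26 -> 'Z'."""
    out = []
    for _ in range(n):
        x *= 27
        d = int(x)  # integer part is the next base-27 digit
        out.append("_" if d == 0 else chr(ord("A") + d - 1))
        x -= d
    return "".join(out)

print(frac_to_base27(PI_FRAC, 7))  # CVEZCVB, as in the title
```

The same function with 26 in place of 27, and the digit map shifted so that 0 = "A", gives the base-26 spelling used in the English post above.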
The normality of a number is independent of the base in which it is written, which means that any finite sequence of letters can be found somewhere in pi. For example, we expect the word "PAPA" to occur once every 500000 digits or so. The text of tomorrow's newspaper also appears somewhere in pi, as does the complete works of Voltaire. All of it even appears several times, indeed infinitely many times! Worse, this is not specific to pi: almost all numbers are normal. This means that most of the time you do a computation, the works of Voltaire are hidden in the answer.
The situation is analogous to that of the monkey imagined by Emile Borel, which would reproduce the works of Molière by typing randomly on a typewriter. In that case, we can console ourselves with the thought that the situation is not really real, since the monkey would need longer than the age of the universe just to type the opening of a single monologue. What I find deeply shocking about the normality of numbers is that everything would be there from the very beginning. Statically, and since forever. Should we conclude that most numbers are not really real?
Saturday, June 7, 2008
Sunday, May 4, 2008
What recursions may come