Imagine discovering that something as fundamental as the decimal point is older than we thought! That’s exactly what happened with the recent unearthing of notes by Giovanni Bianchini, a Venetian merchant of the 15th century. It turns out that Bianchini was using decimal points in his calculations as early as the 1440s, far earlier than historians had believed.
Decimal points are those little dots we use to make sense of numbers, breaking them down into tenths, hundredths, and thousandths. While we might think of them as a modern invention, the idea has a much longer history. Before Bianchini’s time, mathematicians mostly relied on fractions, and astronomers expressed fractional quantities in sexagesimal (base-60) notation, a system inherited from Babylonian astronomy.
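To make the contrast concrete, here is a small Python sketch converting a decimal value into the astronomers’ base-60 form. The function name `to_sexagesimal` and the choice of example value are ours, purely for illustration; this is a sketch of the general place-value idea, not anything from Bianchini’s notes.

```python
from fractions import Fraction

def to_sexagesimal(x: Fraction, places: int = 3):
    """Split x into a whole part and base-60 fractional digits --
    the notation medieval astronomers used before decimal points.
    Each digit d_i contributes d_i / 60**i, just as decimal digits
    contribute tenths, hundredths, and thousandths."""
    whole = int(x)
    frac = x - whole
    digits = []
    for _ in range(places):
        frac *= 60
        d = int(frac)
        digits.append(d)
        frac -= d
    return whole, digits

# The decimal number 10.8 (ten and eight tenths) becomes 10;48,0,0
# in sexagesimal, since 48/60 = 0.8.
print(to_sexagesimal(Fraction(108, 10)))  # (10, [48, 0, 0])
```

The same quantity is exact in both systems here, but the base-60 digits are far less convenient for everyday arithmetic, which is part of why the decimal point eventually won out.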
Bianchini’s notes, particularly his work on calculating stellar coordinates, reveal his use of the decimal point. This discovery challenges the long-held idea that the astronomer Christopher Clavius introduced the notation in 1593. It seems Bianchini was ahead of the game by about 150 years!
This revelation came to light when Glen Van Brummelen, a historian of mathematics, stumbled upon Bianchini’s notes while teaching a math camp for middle schoolers. He was amazed to find evidence of decimal usage dating back to the 1440s.
While Bianchini’s decimal notation didn’t immediately catch on, it likely influenced Clavius and later mathematicians. Eventually, the decimal point became a standard part of mathematical notation, thanks in part to the work of Scottish mathematician John Napier in the early 1600s.
This discovery adds a fascinating new chapter to the history of mathematics, showing that even something as seemingly simple as the decimal point has a rich and complex backstory.