
Welcome to the latest installment of the Brief Bio series, where I'm writing up very informal biographies about major figures in the history of computers. Please take a look at the Brief Bio archive, and feel free to leave corrections and omissions in the comments.

Gottfried Wilhelm Leibniz

My series started with Blaise Pascal not because his contributions to computing were particularly significant. The Pascaline is pretty neat, but it's not programmable. Pascal contributed a lot to mathematics, science, and culture, but his effect on computing was relatively small compared to the next subject: Gottfried Wilhelm Leibniz. He was born when Pascal was 23 years old, but outlived the ailing Pascal by more than five decades. Both men were apparently fond of outrageous haircuts, which must have been quite trendy in late-Renaissance Europe.


Like Pascal, Leibniz had a huge effect on mathematics, science, and culture. If you've heard his name before, it's quite likely you heard it in your college calculus class in the context of Leibniz's notation. Leibniz actually invented calculus (he was a contemporary of Isaac Newton, but developed it independently of him).

Leibniz also spent a lot of time working on mechanical calculators. With the Pascaline in mind, he invented the Leibniz wheel, which could hypothetically be used to construct a calculator that can also do multiplication. From 1672 to 1694, he worked on a Stepped Reckoner that used the Leibniz wheel. This machine actually had some flaws which required internal fiddling to correct. So relax: if even Leibniz's product had bugs, what hope do the rest of us have of shipping bug-free software?
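The basic idea behind multiplication on a stepped-drum machine is repeated addition: add the multiplicand once per unit of each multiplier digit, shifting the carriage one decimal place between digits. Here's a minimal sketch of that process in Python; the function name and structure are my own illustration, not a model of the actual mechanism:

```python
# A sketch (not a mechanical model) of how a stepped-drum calculator
# multiplies: repeated addition of the multiplicand, shifted one
# decimal place for each digit of the multiplier.
def stepped_reckoner_multiply(multiplicand: int, multiplier: int) -> int:
    accumulator = 0
    shift = 0  # carriage position (which decimal place we're on)
    while multiplier > 0:
        digit = multiplier % 10           # current multiplier digit
        for _ in range(digit):            # turn the crank `digit` times
            accumulator += multiplicand * 10 ** shift
        multiplier //= 10                 # move to the next digit
        shift += 1                        # shift the carriage
    return accumulator

print(stepped_reckoner_multiply(247, 36))  # 8892
```

A human operator played the role of the loop here: setting the digit, cranking, and shifting, which is exactly why these machines were calculators rather than computers.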

I couldn't find any video of a Stepped Reckoner (replica or otherwise), probably because the device was so flawed. But I did find a video that shows how the Leibniz wheel works.

But Leibniz's contributions don't stop there. The binary number system was invented by Leibniz. Yeah, the binary number system that runs all the world's technology (a future which Leibniz predicted as well). He described (but didn't build) a system of marbles and punched cards that could be used to perform binary calculations (full original text of De progressione Dyadica [PDF]).
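The core of binary arithmetic as Leibniz described it is repeated halving: divide by two, and the remainders, read in reverse, are the binary digits. A quick sketch (Python's built-in `bin()` does this for you; this spells out the method):

```python
# Convert a decimal number to binary by repeated halving:
# the remainder at each step is the next binary digit (least
# significant first), so we reverse at the end.
def to_binary(n: int) -> str:
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder: 0 or 1
        n //= 2                  # halve and repeat
    return "".join(reversed(bits))

print(to_binary(1703))  # 11010100111
```

(1703 is just an example input; it happens to be the year Leibniz published his paper on binary arithmetic.)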

One quaint view of Leibniz's that I discovered was his optimism about the eventual creation of a writing system/language that would solve all problems of ethics and metaphysics. "If we had such an universal tool, we could discuss the problems of the metaphysical or the questions of ethics in the same way as the problems and questions of mathematics or geometry...two philosophers could sit down at a table and just calculating, like two mathematicians, they could say, 'Let us check it up...'" I find this idea a bit naive, but ya gotta love the optimism.

Leibniz's beliefs about language, thought, and computing would become a very rudimentary prototype for later artificial intelligence research, and he's probably the reason some of my computer science courses were actually in the philosophy department.

After Leibniz's death in 1716, his legacy was in tatters. His philosophical beliefs were under heavy criticism, his independent discovery of calculus was doubted, and many believed that he was simply trying to take credit for something that he didn't discover on his own. Like so many historical figures, his tarnished short-term reputation would eventually give way to the position of respect that he holds today.
