What did Ada Lovelace do?
She is one of the most fetishized scientists today - at conventions when I'm taking sketch commissions she ranks just behind Tesla (speaking of massively if justifiably fetishized historical figures) and Newton as my most requested scientist. But when people talk about why they love Ada Lovelace, I hear a variety of responses:
"She has such a great look."
"She was Lord Byron's daughter, and how cool is that?!"
"She invented the computer, and then Charles Babbage stole the credit."
The last of these is a common misconception, especially popular on the internet, and the first two, while true, don't really approach the main question - just what did Lovelace do?
Her intellectual fame rests upon a single work, a translation of an Italian paper about Babbage's ideas with an accompanying extended commentary of her own authorship. She wrote nothing before, and nothing after, but so important was this one paper for the creation of the modern information age that I don't suppose we'll ever stop talking about it, nor should we.
We'll get to that paper in its proper place, but we would remember Ada Lovelace (1815-1852) even without it. She was a celebrity at birth, being the daughter of international poetry superstar and sex symbol Lord Byron. Society couldn't get enough of Byron's outrageous expenditures and exploits, a curiosity that would linger after his death and open doors of all description to his daughter. The incestuous, debt-ridden, constantly philandering poet, however, only knew his daughter for one month before he was forced to flee England to evade his creditors.
He died at the age of 36 in Greece, and Ada was left to the now-oppressive, now-negligent care of her mother, who would leave Lovelace with servants for long stretches while indulging her mania for trendy health cures. She did, however, encourage Ada's ability in mathematics, hoping that it would curb the dark poetic fancy she inherited from her father. Prone to chronic physical ailments, Ada was unable to attend traditional school and, like Julia Robinson, had an extended spell of critical illness during her adolescence that kept her bed-ridden for three agonizing years.
She recovered, and continued her practical education under her mother's stringent if distant guidance, but she needed an outlet for her prodigious intellectual energies, and found it at last in 19th-century England's most Impossible Thing, the Difference Engine of Charles Babbage. In 1833, Ada Byron met Babbage. She was just under eighteen years of age and he was just coming off the successful demonstration of a working model of one component of his proposed Difference Engine, which would be able to automatically calculate data tables using the method of finite differences.
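The method of finite differences is a lovely trick: for any polynomial, if you keep taking differences of successive values, you eventually hit a column of constants, which means you can run the process in reverse and generate every new table entry using nothing but addition, exactly the kind of thing a stack of gears can do (curvier functions like logarithms get approximated by polynomials a stretch at a time). Here's a minimal sketch of the idea in modern Python; the particular polynomial and the function name are my own illustrative choices, not anything of Babbage's.

```python
# Tabulate a polynomial using only additions after an initial setup --
# the core trick of the Difference Engine. Illustrative sketch, not a
# reconstruction of the actual machine's organization.
def difference_table(poly, degree, count):
    # Seed values: poly(0), poly(1), ..., poly(degree).
    seed = [poly(n) for n in range(degree + 1)]
    # Reduce the seed to the value and its forward differences at n = 0.
    diffs, row = [], seed
    for _ in range(degree + 1):
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    # Now "crank the engine": each new entry costs only additions.
    values = []
    for _ in range(count):
        values.append(diffs[0])
        for i in range(degree):
            diffs[i] += diffs[i + 1]
    return values

print(difference_table(lambda n: n * n + n + 41, 2, 6))
# [41, 43, 47, 53, 61, 71] -- matches evaluating the polynomial directly
```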
Using such an engine, one could produce automatically the vast tables of logarithms that were needed by England's flourishing industrial and economic sectors to do their business. If you're of a Certain Age, you might remember having to carry a book of logarithms with you to science or math class. If not, ask your parents. They will tell you the blood-curdling tale of how they had to solve common arithmetic problems, one of which I'll go into some detail with here, both so that we can all appreciate how lucky we are to live in the age of calculators, and to demonstrate why Babbage was designing a machine to generate logarithmic and other mathematical tables.
Here's a problem your parents might have dealt with:
8848 ÷ 158.
For the modern student, this presents no difficulty whatsoever. Put it into a calculator and you're good. But fifty years ago, you had to use manual methods, including long division which everybody, absolutely everybody, hates, and which would require creating a list of the multiples of 158, which is also just demonstrably The Worst. This is where logarithm tables come in. I basically want to solve
8848 ÷ 158 = x.
So, I'll take the log of both sides:
log (8848 ÷ 158) = log (x).
This, by the logarithm properties we all learned in high school, reduces down to:
log (8848) - log (158) = log (x).
Those first two are values I can now just look up. Here are the entries from my grandfather's 1947 engineering table book:
Tracing 88 on the left, then to the 4 column in the middle, and then finally to the 8 column on the far right, I get that log (8.848) = .9465 + .0004 = .9469. But I didn't want log (8.848), I wanted log (8848). That's okay, though: 8848 = 8.848 x 1000, so log (8848) = log (8.848 x 1000) = log (8.848) + log (1000) = .9469 + 3 (since 10^3 = 1000) = 3.9469.
Using the same method, and the second excerpt, you should get that log (158) = 2.1987. (You get that log (1.58) = .1987; then, since 158 = 1.58 x 100, we add 2 to the result to account for the two powers of ten, just like above, where we added 3 to account for the factor of 1000!)
So, now we have 3.9469 - 2.1987 = log (x), or 1.7482 = log (x). Our terrible division problem has turned into a simple subtraction; all that remains is to undo the logarithm with an antilogarithm table. Looking up .7482 on the table, I get 5.598 + .003 = 5.601. So, 10^.7482 = 5.601, meaning 10^1.7482 = 56.01, so x = 56.01, and at last we have our answer, after taking rounding into consideration, that 8848 ÷ 158 = 56.
Ta-da.
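If you'd like to replay that procedure without tracking down a 1947 table book, here's a tiny Python sketch that imitates it, rounding each logarithm to four decimal places the way the printed four-figure tables do. The function name and the rounding choice are mine, and real tables round a touch differently, so an intermediate digit can differ by one from the lookup above.

```python
# Mimic the table procedure: take four-figure logs, subtract, antilog.
# A minimal sketch; a printed table's own rounding gave 1.7482 above,
# but the final answer comes out the same.
from math import log10

def four_figure_log(n):
    """Stand-in for looking up log10(n) in a four-figure table."""
    return round(log10(n), 4)

log_x = four_figure_log(8848) - four_figure_log(158)
x = 10 ** log_x                       # the antilogarithm step
print(round(log_x, 4), round(x))      # 1.7481 56 -- i.e. 8848 / 158 = 56
```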
That might seem cumbersome and horrid, but once you get used to the tables, you can calculate division, multiplication, powers, and roots of just about anything you want within a minute… provided that you have good tables. To make those tables before Babbage required humans sitting at desks crunching and crunching away to get the desired accuracy. It was a massive and tedious process that devoured time and money, so when Babbage approached the British government with a way to automate the process, they threw cash at him in torrents, enough to allow him to build a section of his Difference Engine.
It was pretty great, but it would require massive numbers of precision-crafted cogs to complete, and British industry was simply not up to the task. In fact, the machine would remain unbuilt until 1991. But none of that mattered to Ada. She saw a titanic undertaking with the potential of changing just about every aspect of human life. When Babbage started describing the successor to the Difference Engine, the Analytical Engine, she saw that, much more than the equation-crunching device that Babbage envisioned, it could become an all-purpose tool for performing any task that could be broken down into a series of algorithmic mechanical steps.
She wanted to see Babbage’s vision realized, and set herself the task of catching up in a few years with the mathematical knowledge Babbage had taken a lifetime to acquire. That would have been an audacious enough task on its own, but on top of it she had to deal with tutors who were active hindrances, who judged that she was too frail for rigorous mathematical work and so tried actively to slow her pace and divert her toward more genteel mathematics.
Fortunately for us, Lovelace stuck to her vision, and by 1843 understood the deep mathematical analysis behind the Analytical Engine nearly as well as Babbage himself, and was starting to push the theoretical structure past the boundaries of its original conception. This was the year that she translated Menabrea’s article on Babbage’s work, and tripled its length by adding her own commentary about the potential of automatic calculation to remake the world. She described the punchcard mechanism that allowed the machine to receive continuous instructions about data values, variable storage, and operations to perform, delved into the mathematics of numerical function analysis and the intimidating practicalities of its mechanical implementation, and even wrote an algorithm that the machine could use to compute Bernoulli numbers, thereby, in the estimation of some historians, creating the world’s first published computer program for a machine that only theoretically existed at the time.
(Now, to be fair, there is a dedicated group of scientific historians out there who disagree that Note G, which contains Ada’s Bernoulli program, is the First Computer Program. There are those who say that Ada did create the Bernoulli program, but that it doesn’t matter, because the honor of first programmer should be given to Joseph Marie Jacquard (1752 - 1834) for his use of punchcards to program looms in 1804, which was itself an outgrowth of Basile Bouchon’s 1725 use of punched paper strips to automate thread selection. Then there are those who say that Note G is probably the first computer program, but that the Bernoulli algorithm in it is Babbage’s from 1837, not Ada’s, and that therefore he should receive credit for it.)
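Whoever gets the credit, the mathematics behind Note G is at least easy to state in modern terms. What follows is only a sketch of the standard recurrence for the Bernoulli numbers in Python, using today's usual indexing (which differs from the numbering Lovelace used); it is emphatically not a transcription of Note G itself, which is laid out as a table of Analytical Engine operations on stored variables.

```python
# Bernoulli numbers via the standard recurrence
#   sum_{k=0}^{m} C(m+1, k) * B_k = 0   for m >= 1, with B_0 = 1.
# A modern sketch of the mathematics behind Note G, not the Note G
# program itself.
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the exact Bernoulli numbers B_0 through B_n."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))
    return B

print(bernoulli(8))
# [Fraction(1, 1), Fraction(-1, 2), Fraction(1, 6), Fraction(0, 1),
#  Fraction(-1, 30), Fraction(0, 1), Fraction(1, 42), Fraction(0, 1),
#  Fraction(-1, 30)]
```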
The vision expressed in that paper, of breaking a problem down to operations and variables, and encoding a solution procedure on a set of carded instructions that could then be mechanically worked upon to arrive at an ultimate solution, took the raw machinery and power of Babbage’s vision and applied it to the world at large, to music and philosophy, to anything that might admit of an ordered, rules-based approach. It was an imaginative leap of importance made all the more compelling by the inclusion of a detailed algorithm for achieving a practical result automatically.
That could have been the start of something transcendent, but it turned out to be the end. Babbage’s prickly persecution complex made enemies of the people whose support he desperately needed, culminating in a disastrous meeting with Prime Minister Robert Peel, and when Lovelace offered to take over the public development and practical organization of building the Engines, he flatly refused her help. They would remain friendly for the rest of their lives, but her public role in advancing computational theory was over almost as soon as it had begun.
Lovelace’s creative energies thus blocked from expression, she threw herself somewhat hopelessly into a love affair and a major gambling addiction that all but bankrupted her while her husband, the generally useless if affable William, Earl of Lovelace, looked on. And then, as if not hobbled enough by ill fortune, sickness struck, a uterine cancer that forced her to choose between pain and thought. She hated how the laudanum and opium needed to reduce the grinding pain clouded her mind, but as her body turned precipitously against her, leaving her confined to the couch and a wheelchair, there was no alternative, and her letters from this time show a heartbreaking lack of focus and clarity.
The woman who was once toasted as the greatest female scientific mind of her time (next to Mary Somerville, her long-time friend) ended her days in a lonely pall. Her husband avoided her, as did her mother. After a particularly somber last visit, she ordered Babbage not to see her again, so ashamed had she become of her frailty. After three years of suffering, she died at last in 1852 at the age of 36, the same as that of her infamous father, beside whom, in an act of final rebellion against her controlling mother, she requested to be buried.
FURTHER READING:
If you’ve been reading my Women in Science column here and there over the last decade, you’ll have been subjected to my intermittent bemoaning of the fact that there has yet to be an Ada Lovelace biography that really hits the nail on the head in terms of providing a good mix of who she was, the struggles she faced, and what she accomplished. That isn’t to say that the books on her that have been published in the last forty years have been bad, merely that the balance hasn’t been quite right, at least for me. Essinger’s Ada’s Algorithm (2014) didn’t have quite enough Ada in it, and both Stein’s Ada: A Life and Legacy (1986) and Toole’s Ada: The Enchantress of Numbers (1998) were a bit too polarized in their takes, albeit in very different directions.
After Essinger, there was an explosion of books on Lovelace, including Dreaming in Code: Ada Byron Lovelace, Computer Pioneer (2022), Ada Lovelace: The Making of a Computer Scientist (2018), Ada Lovelace: A Life from Beginning to End (2019), In Byron’s Wake: The Turbulent Lives of Lord Byron’s Wife and Daughter (2018), Charles and Ada (2019, also by Essinger), and the novels Enchantress of Numbers: A Novel of Ada Lovelace (2018), and Ada Lovelace: The Countess Who Dreamed in Numbers (2019). Most recently, Beverly Adams has written Ada Lovelace: The World’s First Computer Programmer (2023), which, psychologically, I think is right on the nose, but steadfastly avoids talking about the controversies surrounding her place in the larger computer programming pantheon, which generated controversy all its own when the book was released.