December 4, 2013
In 1981, an unknown epidemic was spreading across America. In June of that year, the Centers for Disease Control and Prevention’s newsletter mentioned five cases of a strange pneumonia in Los Angeles. By July, 40 cases of a rare skin cancer were reported by doctors working in the gay communities of New York and San Francisco. By August, the Associated Press reported that two rare diseases, the skin cancer Kaposi’s sarcoma and pneumocystis, a form of pneumonia caused by a parasitic organism, had struck more than 100 gay men in America, killing over half of them. At the end of 1981, 121 men had died from the strange disease; in 1982, the disease was given a name; by 1984, two different scientists had isolated the virus causing it; in 1986, that virus was named HIV. By the end of the decade, in 1989, 27,408 people died from AIDS.
In the years since the start of the AIDS epidemic, medical research has given us a better understanding of HIV and AIDS and produced some remarkable breakthroughs unimagined in the 1980s: today, people living with HIV are no longer facing a certain death sentence but have treatment options available. Still, to think of the AIDS epidemic in purely medical terms misses half of the story–the social aspect, which shaped America’s perception of HIV and AIDS just as much as, if not more than, medical research.
The two sides of the story are told through a collection of articles, pictures, posters and pamphlets in Surviving and Thriving: AIDS, Politics and Culture, a traveling exhibit and online adaptation curated by the National Library of Medicine that explores the rise of AIDS in the early 1980s, as well as the medical and social responses to the disease since. The human reaction to the AIDS epidemic often takes a back seat to the medical narrative, but the curators of Surviving and Thriving were careful to make sure that this did not happen–through a series of digital panels, as well as a digital gallery, readers can explore how the government and other community groups talked about the disease.
At the beginning of the epidemic, response was largely limited to the communities which were most affected, especially the gay male community. “People with AIDS are really a driving force in responding to the epidemic and seeing how change is made,” says Jennifer Brier, a historian of politics and sexuality who curated the exhibit.
In 1982, Michael Callen and Richard Berkowitz, two gay men living with AIDS in New York City, published How to Have Sex in an Epidemic, which helped spread the idea that safe sex could be used as protection against spreading the epidemic–an idea that hadn’t yet become prevalent in the medical community. The pamphlet was one of the first places that proposed that men should use condoms when having sex with other men as a protection against AIDS.
Condoms as protection against AIDS became a major theme for poster campaigns. The above poster, paid for by the Baltimore-based non-profit Health Education Resource Organization, shows how visuals attempted to appeal, at least at first, to the gay community. Due to widespread misinformation, however, many people believed that AIDS was a disease that affected only white gay communities. As a response to this, black gay and lesbian communities created posters like the one below, to show that AIDS didn’t discriminate based on race.
Many posters and education campaigns harnessed sexual imagery to convey the importance of safe sex in an attempt to make safety sexy (like the Safe Sex is Hot Sex campaign), but it wasn’t a campaign tactic supported by governmental bodies–in fact, in 1987, Congress explicitly banned the use of federal funds for AIDS prevention and education campaigns that “[promoted] or [encouraged], directly or indirectly, homosexual activities” (the legislation was spearheaded by conservative senator Jesse Helms and signed into law by President Reagan).
Instead, federally-funded campaigns sought to address a large number of people from all backgrounds–male, female, homosexual or heterosexual. The America Responds to AIDS campaign, created by the CDC, ran from 1987 to 1996 and became a central part of the “everyone is at risk” message of AIDS prevention.
The campaign was met with mixed feelings by AIDS workers. “The posters really do help ameliorate the fear and hatred of people with AIDS,” Brier explains. “There’s a notion that everyone is at risk, and that’s important to talk about, but there’s also the reality that not everyone is at risk to the same extent.” Some AIDS organizations, especially those providing service to communities at the highest risk for contracting HIV, saw the campaign as diverting money and attention away from the communities that needed it the most–leaving gay and minority communities to compete with one another for the little money that remained. As New York Times reporter Jayson Blair wrote in 2001, “Much of the government’s $600 million AIDS-prevention budget was used…to combat the disease among college students, heterosexual women and others who faced a relatively low risk of contracting the disease.”
(This linked column by Blair was later found to be plagiarized from reporting by the Wall Street Journal, but the point still holds.)
Beyond campaigns that tried to generalize the AIDS epidemic, a different side used the fear of AIDS to try to effect change. These posters, contained under the section “Fear Mongering” in the exhibit’s digital gallery, show ominous images of graves or caskets behind proclamations of danger.
“It was like this sort of scared straight model, like if you get scared enough, you really will do what is right,” Brier says of the posters. “There were posters that focused on pleasure, or health, or positive things to get people to effect change in their behavior, but there were consistently posters that used the idea that fear could produce behavior change.”
The above poster exemplifies the tactic of fear mongering: a large, visible slogan meant to instill fear (and shame sexual behavior), while information on how to prevent the spread of AIDS is buried in small print at the bottom of the poster. A lack of information was typical of fear-mongering posters, which relied on catchy, scary headlines rather than information about safe sex, clean needles or the disease itself.
“The posters fed on people’s inability to understand how AIDS actually spread. It didn’t really ever mention ways to prevent the spread of HIV,” Brier says. “Fear-mongering posters don’t talk about condoms, they don’t talk about clean needles, they don’t talk about ways to be healthy. They don’t have the solutions in them, they just have the fear.”
Through exploring the exhibit, users get a sense of the different approaches public organizations took to spread information about AIDS. “It’s a fundamental question of public health,” says Brier. “Do you spread information by scaring people, do you do it by trying to tap into pleasure or do you do it by recognizing that people’s behavior isn’t just about their individual will but a whole different set of circumstances?”
November 21, 2013
“Doctor Who,” the classic British sci-fi television show, celebrates its 50th anniversary this weekend. For those who’ve never seen the program, which in the United States has aired mostly on PBS stations and, more recently, BBC America, here’s a short rundown: The main character is a man called the Doctor. He’s an alien from a race called the Time Lords. He travels through time and space in a blue police box that’s really a disguise for his bigger-on-the-inside ship called the TARDIS (Time and Relative Dimension in Space). In each episode, the Doctor and a companion (or two or three) explore the universe while fighting monsters and other enemies along the way. And every so often, the Doctor “regenerates,” taking on a new body and face, letting a new actor take over the lead role.
The formula has changed little since “Doctor Who” premiered on the BBC on November 23, 1963. The show has survived poor production values, the Doctor getting stranded on Earth for years, declining public interest, cancellation in the late 1980s and a failed attempt to reboot the series in 1996, only to return in 2005 with new fans and new respect.
“Doctor Who” has set itself apart from other science fiction franchises, such as “Star Trek,” which focused solely on the future, by taking advantage of the Doctor’s ability to travel through time and periodically visit the past. This focus on history has waxed and waned over the years, reflecting the interests of the show’s producers and viewers, but it has produced some unique storylines centered on pivotal moments in human history. Nearly all of these episodes are available on DVD or Netflix, although two episodes of “The Crusade” are preserved only as audio.
Adventures in the first season of “Doctor Who” took viewers into historical events such as Marco Polo’s 1289 expedition to Central Asia and the Reign of Terror in late 18th-century France. Though the show’s most iconic monsters, the pepperpot-shaped Daleks, had already been introduced by this time, these history stories got their drama from human events. In “The Aztecs,” the Doctor (William Hartnell) and his companions become trapped in 15th-century Mexico. One of the companions, history teacher Barbara, is briefly hailed as a divine reincarnation of a high priest and tries to put an end to the Aztec practice of human sacrifice. Her efforts fail, and history moves on.
“Doctor Who” has frequently celebrated and explored iconic periods in British history while putting a bit of a twist on them. In “The Crusade,” the Doctor (again played by William Hartnell) and his companions find themselves in 12th-century Palestine, caught in the middle of the conflict between the European crusaders, led by King Richard the Lionheart, who have conquered the land, and the Saracens, led by Saladin, who are trying to drive them out. The story highlights the political machinations of the real-life leaders and the bloodthirsty nature of the Crusades themselves. The Doctor tries not to get caught up in court politics, as Richard attempts to broker a peace agreement by marrying off his sister to Saladin’s brother. But of course the Doctor fails, barely escaping a death sentence.
The Doctor may be known for traveling through time and space, but his third incarnation (played by Jon Pertwee) was banished by his fellow Time Lords to present-day Earth. Time travel stories returned, however, with the Fourth Doctor (portrayed by Tom Baker). In 1975, he and his frequent companion, journalist Sarah Jane Smith, found themselves in England in 1911 in the home of a professor who had gone missing while excavating a pyramid in Egypt. The professor had accidentally released an alien named Sutekh—which fans of Egyptian history will recognize as another name for the chaos god Set—who had been locked in that pyramid by his brother Horus and their fellow Osirians. The Doctor and Sarah Jane must battle robotic mummies roaming the grounds before taking down Sutekh and saving the human race.
One of the Doctor’s greatest enemies was another Time Lord, the Master. In “The King’s Demons,” the Doctor (now played by Peter Davison) encounters his arch-nemesis at a medieval joust in the time of King John. In one of the Master’s smaller evil machinations—in later years, for example, the Master turns every human on Earth into a copy of himself—he tries to thwart the course of human history by provoking a rebellion that will depose King John and prevent the creation of the Magna Carta, the foundation of constitutional government in the English-speaking world. The Doctor intervenes, setting history back on course.
The Master is messing with earthlings again, this time paired with another renegade Time Lord, the Rani, in the English town of Killingworth. This is the time of the Luddites, a group of English textile workers who were protesting changes brought about by the Industrial Revolution in the early 1800s. Key to the Doctor Who story is real-life engineer and railway pioneer George Stephenson, who saves the Doctor (portrayed by Colin Baker) from a group of Luddites who pushed him down a mineshaft.
History episodes became more frequent with the 2005 reboot of the “Doctor Who” franchise. The show’s producers, in their efforts to reintroduce the Doctor (played by Christopher Eccleston) to a new generation, set the entire first season on Earth. In a memorable pair of episodes, the Doctor and companion Rose find themselves in London during World War II, pursued by a creepy gas-mask-wearing child with a deadly touch. While later WWII-themed episodes feature notable historical figures from that era, including Winston Churchill and Adolf Hitler, these episodes instead center on the sad story of homeless, orphaned children who had been cast adrift amidst the chaos of the London Blitz.
“The Girl in the Fireplace” is a masterful marriage of futuristic science fiction with a real person from the past. The Doctor (portrayed by David Tennant) and his companions find themselves on an abandoned spaceship in the 51st century. The crew is missing, but throughout the ship are portals into 18th-century France, points in time along the life of a Frenchwoman named Reinette. The young girl grows up to become Madame de Pompadour, mistress of King Louis XV, pursued her whole life by the clockwork men of the spaceship who believe that only her brain can fix their ship.
A classic “Doctor Who” trope is to take an event in history and provide another explanation for what happened. In this case, it’s “volcano day” in the city of Pompeii. Shortly after his arrival, the Doctor (again, David Tennant) is temporarily stranded when a merchant sells his TARDIS to a local businessman, Lucius Caecilius, who thinks the blue box is a piece of avant-garde art. Caecilius was based on a real person, Lucius Caecilius Iucundus, a banker whose villa was found in excavations of the Italian town that was buried under volcanic ash in 79 A.D. In the Doctor Who version of Iucundus’ story, the explosion that likely killed him was caused not by a volcano but by the Doctor. He and his companion Donna initiate the explosion to save the world from a race of aliens, the Pyrovillians, who were living in Vesuvius and planning to take over the Earth.
The renewal of “Doctor Who” brought a new type of history episode based on literary figures. The first explained how Charles Dickens got inspired to write about ghosts at Christmas. A later story showed what happened to William Shakespeare’s missing play Love’s Labour’s Won. The third of this genre, “The Unicorn and the Wasp,” cleared up a mystery regarding the world’s greatest mystery writer, Agatha Christie—what happened to her during the 11 days in 1926 that she simply disappeared? In the Doctor Who story, set at a house party during the 1920s, Christie was helping the Doctor (David Tennant) solve a Christie-inspired murder mystery and then did a little traveling in the TARDIS.
While at a Van Gogh exhibit at the Musée d’Orsay in modern-day Paris, the Doctor (played by Matt Smith) notices a curious monster peeking out a window in Van Gogh’s The Church at Auvers and decides to investigate, quickly jumping back in time to visit the great painter in 1890. Scenes directly reference paintings such as Café Terrace at Night and Bedroom in Arles, while the story revolves around Van Gogh’s periods of exhaustion and depression, as well as his eventual suicide. The Doctor’s companion Amy Pond tries to avert Van Gogh’s tragic end by taking him to the exhibition where the episode began, where he can hear his work praised. But Amy is saddened to discover that her efforts had no effect, and Van Gogh eventually killed himself, as history remembers. As with all of Doctor Who’s history stories, this one reminds the viewer that although the Doctor can’t change the past’s biggest events, he can bring a bit of joy and happiness to some of our saddest moments.
November 19, 2013
In the past century, few events have been studied with greater scrutiny than the assassination of President John F. Kennedy. But that is exactly the problem, according to author and History Channel personality Brad Meltzer.
“Put together all of the official investigations, commissions, reports, official reinvestigations, independent reviews of the evidence, journalistic inquiries, reenactments, documentaries, movies, literally thousands of books (fiction and nonfiction), not to mention countless off-the-wall and over-the-top websites, and you’ve got a situation that’s a perfect breeding ground for confusion, differing interpretations, allegations and refutations,” he writes in his latest book, History Decoded: The 10 Greatest Conspiracies of All Time.
There have been those who believe that Lee Harvey Oswald did not act alone, that there were two shooters on that fateful day in Dallas, November 22, 1963. Others have tried to pin the blame on the Soviets, the CIA and the mafia.
One natural place to look for answers is the president’s autopsy. Medical professionals at the National Naval Medical Center in Bethesda, Maryland, examined Kennedy’s body just hours after he was pronounced dead, drawing what conclusions they could from his wounds about the cause of death and location of the assassin. In Dallas, the president’s staff had hurriedly loaded his casket onto Air Force One, while city officials squabbled over a state law that required the autopsy to be performed in Texas. Just nine minutes after Lyndon Johnson took the oath of office on the plane, it was wheels up.
President Lyndon Johnson convened the Warren Commission, a group of congressmen and other prominent officials, a week later to investigate Kennedy’s assassination. The investigators, out of respect to the president’s legacy, saw neither the photographs nor the x-rays from the autopsy, though the decision to keep such medical evidence private has often been questioned. (In 1966, the Kennedy family donated these official images to the National Archives, where they remain sealed from the public.) One of the only visuals left for the group’s consideration was this descriptive autopsy sheet, or “face sheet,” which the pathologists filled out in the autopsy room, marking the figure with the two bullets’ entry and exit points. The doctors referred to these notes when writing the more detailed autopsy report.
November 6, 2013
Albert Woolson loved the parades. For Memorial Day in Duluth, Minnesota, he rode in the biggest car down the widest streets of his hometown. The city etched his name in the Duluth Honor Roll, and he was celebrated at conventions and banquets across the North. Even the president wrote him letters on his birthday. Because everyone said he was the last surviving member of the Grand Army of the Republic, a fraternal organization of Union veterans once nearly half a million strong, they erected a life-size statue of him on the most hallowed ground of that entire horrible conflict—Gettysburg.
Though deaf and often ill, he was still spry enough that, even at 109 years of age, he could be polite and mannerly, always a gentleman. He was especially fond of children and enjoyed visiting schools and exciting the boys with stories of cannon and steel and unbelievable courage on the fields around Chattanooga. The boys called him “Grandpa Al.”
But Woolson could be fussy. His breakfast eggs had to be scrambled and his bacon crisp. He continued to smoke; he had probably lit up more than a thousand cigars just since he had hit the century mark. And no one kept him from his half-ounce of brandy before dinner.
His grandfather had served in the War of 1812, and when guns were fired on Fort Sumter in 1861, his father went off to fight for Lincoln. He lost a leg and died. So, as the story goes, young Albert, blue-eyed and blonde-haired, a mere five and a half feet tall, took his father’s place. With just a year left in the war, he enlisted as a drummer boy with the 1st Minnesota Heavy Artillery Regiment, rolling his snare as they marched south to Tennessee.
But that had been long ago, more than 90 years past. Now Albert Woolson’s days were fading, the muffled drum of his youth a softening memory. At St. Luke’s Hospital in Duluth, his health deteriorating, he would sometimes feel his old self, quoting Civil War verse or the Gettysburg Address. But then on a Saturday in late July, 1956, he slipped into a coma. Just before he drifted off, he asked a nurse’s aide for a dish of lemon sherbet. She gave him some soft candy too. As she shut the door she glanced back at her patient. “I thought he was looking very old,” she recalled. For a week he lay quietly in his hospital bed, awaiting death.
Down in Houston, old Walter Washington Williams had sent Woolson a telegram congratulating him on turning 109. “Happy birthday greetings from Colonel Walter Williams,” the wire said.
Williams was blind, nearly deaf, rail-thin, and confined to a bed in his daughter’s house. He had served as a Confederate forage master for Hood’s Brigade, they said, and now he was bound and determined to be the last on either side still alive when America’s great Civil War Centennial commemoration began in 1961. “I’m going to wait around until the others are gone,” he said, “to see what happens.”
Williams had ridden in a parade too. He was named in presidential proclamations and tributes in the press. Life magazine devoted a three-page spread to the old Rebel, including a photograph of Williams propped up on his pillows, a large Stars and Bars flag hanging on the wall. An American Legion band serenaded at his window, and he tapped his long, spindly fingers in time with “Old Soldiers Never Die.” But Williams was a Southern boy deep in his bones. He would have preferred “Cotton-Eyed Joe” on the radio:
O Lawd, O Lawd,
Come pity my case.
For I’m gettin’ old
An’ wrinkled in de face.
Like Woolson, Williams could be cantankerous. On his last birthday, when he said he was 117, they served him his favorite barbecued pork, though his daughter and a nurse had to feed him. His bed was piled high with cards and telegrams, but he could not read them. He could hardly pick them up. “I’m tired of staying here,” he complained in his son’s ear. The son smiled and told visitors how they had hunted deer together when his father was 101. “He rode a horse until he was 103,” the son said.
Williams’ last public outing was in an Armed Forces Day parade in Houston in May 1959, when he rode in an air-conditioned ambulance. As he passed the reviewing stand, he struggled to raise his arm in salute. Then they took him home and put him back to bed.
Four times he suffered bouts of pneumonia; twice they hung an oxygen tent over his bed. His doctor was doubtful, and his daughter feared the worst. “There’s too many years; too many miles,” she said.
And so the clock ticked down, not just on Albert Woolson and Walter Williams, but for a whole generation, an entire era, the closing of a searing chapter in American history: four years of brutal civil war. Like the old soldiers, memories of the North and South and how they had splintered and then remade America were slowly dying out too. Starting in the 1920s, ’30s, and ’40s, Civil War soldiers began passing away in rapid numbers, nearly three a day. The glorious reunions of proud veterans at Gettysburg and the cities of the South were coming to an end; there were too few healthy enough to attend. The Grand Army of the Republic closed its last local chapter. The Rebel yell fell silent. The campfires went dark. Echoing down the years were Gen. Robert E. Lee’s last words: “Strike the tent.”
By the start of the 1950s, about 65 of the blue and gray veterans were left; by 1955, just a half dozen. As their numbers dwindled they became artifacts of a shuttered era, curiosities of an ancient time, sepia-toned figures still inhabiting a modern world from their rocking chairs and oxygen tents. They had gone to war with rifles and sabers and in horse-mounted patrols. They had lived off hardtack and beans. Now they seemed lost in a new American century that had endured two devastating world wars fought with armored tank divisions, deadly mustard gas, and atomic bombs that fell from the sky.
Bruce Catton, long a chronicler of the Civil War, could recall his boyhood in the “pre-automobile age” of rural Michigan and how a group of old Union veterans in white whiskers and blue greatcoats had delighted his young eyes. He remembered one selling summer berries from a pail he hooked over the stub of his forearm, an arm he had lost in the Battle of the Wilderness. A church deacon had fought with the 2nd Ohio Cavalry in Virginia’s Shenandoah Valley, burning barns and killing livestock. Another had returned to Gettysburg for the 50th anniversary there, and when he arrived back by train and his buggy was late, the 70-year-old simply hoisted his bag and walked the five miles home. “They were grave, dignified, and thoughtful,” Catton would write of his hometown heroes. “For the most part they had never been 50 miles away from the farm or the dusty village streets; yet once, ages ago, they had been everywhere and had seen everything. . . . All that was real had taken place when they were young; everything after that had simply been a process of waiting for death.” Eventually, one by one the old men were carried up a small hilltop to the town cemetery. “As they departed,” Catton wrote, “we began to lose more than we knew we were losing.”
By the close of the 1950s, as the nation was preparing for the 100th anniversary of the Civil War, much of the public watched transfixed, marking the passing of each of the final veterans, wondering who might be the last, wondering if any would make it to the centennial, curious how anyone could live so long. Could anyone be so old?
That question seemed never more poignant than when a Confederate veteran from Georgia disrupted a Civil War museum and jabbed his cane in sudden bayonet thrusts, threatening the portraits of Yankee soldiers hanging on the wall. “Let me at him!” he yelled at a painting of Union hero Gen. William Tecumseh Sherman, the scourge of Atlanta. Sadly, the old Rebel appeared a pitiful figure, a misfit, more a caricature of himself than a gallant hero from an epic time.
Because it turns out that many of the men were not so old after all.
Many who claimed to be well over 100 and survivors of that great war were really imposters, some flat-out frauds. In truth they had been mere children and too young to march off to war in the early 1860s. Or they had not even been born. Yet as they grew old, they fabricated stories about past heroic adventures and brazenly applied for Civil War pensions during the long, lean years of the Great Depression. Some backdated their birth dates. Some made up the names of comrades and commanding officers. Some lied to their friends and neighbors and to newspapers and government officials. Over the years, some accepted so many accolades as Civil War veterans that they never could muster the courage or the humility to own up to the truth, even as they lay near death. Many ended up believing their own fabrications. Driven by money, ego, or a craving to belong to something grand and glorious, these men defrauded a nation. They especially dishonored those who had served, those who had been wounded, and above all those who had died. Many of them fooled their own families. One fooled the White House.
The last veteran who said he fought for the Union was Albert Woolson; Walter Williams said he was the last Confederate. One of them indeed was a soldier, but one, according to the best evidence, was a fake. One of them had been living a great big lie.
This is an excerpt from Last of the Blue and Gray by Richard A. Serrano, published by Smithsonian Books. Order your own copy now.
September 18, 2013
In 1881, Edward Charles Pickering, director of the Harvard Observatory, had a problem: the volume of data coming into his observatory was exceeding his staff’s ability to analyze it. He also had doubts about his staff’s competence–especially that of his assistant, whom Pickering dubbed inefficient at cataloging. So he did what any scientist of the late 19th century would have done: he fired his male assistant and replaced him with his maid, Williamina Fleming. Fleming proved so adept at computing and copying that she would work at Harvard for 34 years–eventually managing a large staff of assistants.
So began an era in Harvard Observatory history where women—more than 80 during Pickering’s tenure, from 1877 to his death in 1919— worked for the director, computing and cataloging data. Some of these women would produce significant work on their own; some would even earn a certain level of fame among followers of female scientists. But the majority are remembered not individually but collectively, by the moniker Pickering’s Harem.
The less-than-enlightened nickname reflects the status of women at a time when they were–with rare exception–expected to devote their energies to breeding and homemaking or to bettering their odds of attracting a husband. Education for its own sake was uncommon and work outside the home almost unheard of. Contemporary science actually warned against educating women, in the belief that women were too frail to handle the stress. As doctor and Harvard professor Edward Clarke wrote in his 1873 book Sex in Education, “A woman’s body could only handle a limited number of developmental tasks at one time—that girls who spent too much energy developing their minds during puberty would end up with undeveloped or diseased reproductive systems.”
Traditional expectations of women slowly changed; six of the “Seven Sisters” colleges began admitting students between 1865 and 1889 (Mount Holyoke opened its doors in 1837). Upper-class families encouraged their daughters to participate in the sciences, but even though women’s colleges invested more in scientific instruction, they still lagged far behind men’s colleges in access to equipment and funding for research. In a feeble attempt to remedy this inequality, progressive male educators sometimes partnered with women’s institutions.
Edward Pickering was one such progressive thinker–at least when it came to opening up educational opportunities. A native New Englander, he graduated from Harvard in 1865 and taught physics at the Massachusetts Institute of Technology, where he revolutionized the method of scientific pedagogy by encouraging students to participate in experiments. He also invited Sarah Frances Whiting, an aspiring young female scientist, to attend his lectures and to observe his experiments. Whiting used these experiences as the basis for her own teaching at Wellesley College, just 13 miles from Pickering’s classroom at MIT.
Pickering’s approach toward astronomic techniques was also progressive; instead of relying solely on notes from observations made by telescope, he emphasized examining photographs–a type of observation known today as astrophotography, which uses a camera attached to a telescope to take photos. The human eye, he reasoned, tires with prolonged observation through a telescope, and a photograph can provide a clearer view of the night sky. Moreover, photographs last much longer than bare-eye observations and notes.
Early astrophotography used the technology of the daguerreotype to transfer images from a telescope to a photographic plate. The process was involved and required long exposure time for celestial objects to appear, which frustrated astronomers. Looking for a more efficient method, Richard Maddox revolutionized photography by creating a dry plate method, which, unlike the wet plates of earlier techniques, did not have to be used immediately–saving astronomers time by allowing them to use dry plates that had been prepared before the night of observing. Dry plates also allowed for longer exposure times than wet plates (which ran the risk of drying out), providing for greater light accumulation in the photographs. Though the dry plates made the prep work more efficient, their sensitivity to light still lagged behind what astronomers desired. Then, in 1878, Charles Bennett discovered a way to increase the sensitivity to light, by developing the plates at 32 degrees Celsius. Bennett’s discovery revolutionized astrophotography, making the photographs taken by the telescopes nearly as clear and useful as observations seen with the naked eye.
When Pickering became director of the Harvard Observatory in 1877, he lobbied for the expansion of the observatory’s astrophotography technology, but it wasn’t until the 1880s, when the technology had greatly improved, that these changes were truly implemented. The prevalence of photography at the observatory rose markedly, creating a new problem: there was more data than anyone had time to interpret. The work was tedious, with duties thought to lend themselves to a cheaper, less-educated workforce believed capable of classifying stars rather than observing them: women. By employing his female staff in this work, Pickering certainly made waves in the historically patriarchal realm of academia.
But it’s hard to tout Pickering as a wholly progressive man: by limiting the assistants’ work to largely clerical duties, he reinforced the era’s common assumption that women were cut out for little more than secretarial tasks. These women, referred to as “computers,” were the only way that Pickering could achieve his goal of photographing and cataloging the entire night sky.
All told, more than 80 women worked for Pickering during his tenure at the Harvard Observatory (which extended to 1919), putting in six-day weeks poring over photographs, and earning 25 to 50 cents an hour (half what a man would have been paid). The daily work was largely clerical: some women would reduce the photographs, taking into account things like atmospheric refraction, in order to render the image as clear and unadulterated as possible. Others would classify the stars through comparing the photographs to known catalogs. Others cataloged the photographs themselves, making careful notes of each image’s date of exposure and the region of the sky. The notes were then meticulously copied into tables, which included the star’s location in the sky and its magnitude. It was a grind. As Fleming noted in her diary:
In the Astrophotographic building of the Observatory, 12 women, including myself, are engaged in the care of the photographs…. From day to day my duties at the Observatory are so nearly alike that there will be little to describe outside ordinary routine work of measurement, examination of photographs, and of work involved in the reduction of these observations.
But regardless of the unequal pay and distribution of duties, this work was incredibly important; the data provided the empirical foundations for larger astronomical theory. Pickering allowed some women to make telescopic observations, but this was the exception rather than the rule. Mostly, women were barred from producing real theoretical work and were instead relegated to analyzing and reducing the photographs. These reductions, however, served as the statistical basis for the theoretical work done by others. Chances for great advancement were extremely limited. Often the most a woman could hope for within the Harvard Observatory would be a chance to oversee less-experienced computers. That’s what Williamina Fleming was doing when, after almost 20 years at the observatory, she was appointed Curator of Astronomical Photos.
One of Pickering’s computers, however, would stand out for her contribution to astronomy: Annie Jump Cannon, who devised a system for classifying stars that is still used today. But as an article written in The Woman Citizen‘s June 1924 issue reported: “The traffic policeman on Harvard Square does not recognize her name. The brass and parades are missing. She steps into no polished limousine at the end of the day’s session to be driven by a liveried chauffeur to a marble mansion.”
Cannon was born in Dover, Delaware, on December 11, 1863. Her father, a shipbuilder, had some knowledge of the stars, but it was her mother who passed on her own childhood interest in astronomy. Both parents nourished her love of learning, and in 1880, when she enrolled at Wellesley College, she became one of the first young women from Delaware to go away to college. At Wellesley, she took classes under Whiting, and while doing graduate work there she helped Whiting conduct experiments on x-rays. But when the Harvard Observatory began to gain fame for its photographic research, Cannon transferred to Radcliffe College in order to work with Pickering, beginning in 1896. Pickering and Fleming had been working on a system for classifying stars based on their temperatures; Cannon, adding to work done by fellow computer Antonia Maury, greatly simplified that system, and in 1922, the International Astronomical Union adopted it as the official classification system for stars.
In 1938, two years before Cannon retired and three years before she died, Harvard finally acknowledged her by appointing her the William C. Bond Astronomer. During Pickering’s 42-year tenure at the Harvard Observatory, which ended with his death in 1919, he received many awards, including the Bruce Medal, the Astronomical Society of the Pacific’s highest honor. Craters on the moon and on Mars are named after him.
And Annie Jump Cannon’s enduring achievement was dubbed the Harvard—not the Cannon—system of spectral classification.
Sources: “Annals of the Astronomical Observatory of Harvard College, Volume XXIV,” on Take Note, An Exploration of Note-Taking in Harvard University Collections, 2012. Accessed September 3, 2013; “Annie Cannon (1863–1941)” on She Is An Astronomer, 2013. Accessed September 9, 2013; “Annie Jump Cannon” on Notable Name Database, 2013. Accessed September 9, 2013; “Brief History of Astrophotography” on McCormick Museum, 2009. Accessed September 18, 2013; “The ‘Harvard Computers’” on WAMC, 2013. Accessed September 3, 2013; “The History of Women and Education” on the National Women’s History Museum, 2007. Accessed August 19, 2013; Kate M. Tucker. “Friend to the Stars” in The Woman Citizen, June 14, 1924; Keith Lafortune. “Women at the Harvard College Observatory, 1877–1919: ‘Women’s Work,’ The ‘New’ Sociality of Astronomy, and Scientific Labor,” University of Notre Dame, December 2001. Accessed August 19, 2013; Margaret Walton Mayhall. “The Candelabrum” in The Sky, January 1941; Moira Davison Reynolds. American Women Scientists: 23 Inspiring Biographies, 1900–2000. Jefferson, NC: McFarland & Company, 1999; “Williamina Paton Stevens Fleming (1857–1911)” on the Harvard University Library Open Collections Program, 2013. Accessed September 3, 2013.