This Day in History
On this day in 1969, America’s first automatic teller machine (ATM) makes its public debut, dispensing cash to customers at Chemical Bank in Rockville Centre, New York. ATMs went on to revolutionize the banking industry, eliminating the need to visit a bank to conduct basic financial transactions. By the 1980s, these money machines had become widely popular and handled many of the functions previously performed by human tellers, such as check deposits and money transfers between accounts. Today, ATMs are as indispensable to most people as cell phones and e-mail.
Several inventors worked on early versions of a cash-dispensing machine, but Don Wetzel, an executive at Docutel, a Dallas company that developed automated baggage-handling equipment, is generally credited as coming up with the idea for the modern ATM. Wetzel reportedly conceived of the concept while waiting in line at a bank. The ATM that debuted in New York in 1969 was only able to give out cash, but in 1971, an ATM that could handle multiple functions, including providing customers’ account balances, was introduced.
ATMs eventually expanded beyond the confines of banks and today can be found everywhere from gas stations to convenience stores to cruise ships. There is even an ATM at McMurdo Station in Antarctica. Non-banks lease the machines (so-called “off premise” ATMs) or own them outright.
Today there are well over 1 million ATMs around the world, with a new one added approximately every five minutes. It’s estimated that more than 170 million Americans over the age of 18 had an ATM card in 2005 and used it six to eight times a month. Not surprisingly, ATMs get their busiest workouts on Fridays.
In the 1990s, banks began charging fees to use ATMs, a profitable move for them and an annoying one for consumers. Consumers were also faced with an increase in ATM crimes and scams. Robbers preyed on people using money machines in poorly lit or otherwise unsafe locations, and criminals also devised ways to steal customers’ PINs (personal identification numbers), even setting up fake money machines to capture the information. In response, city and state governments passed legislation such as New York’s ATM Safety Act in 1996, which required banks to install such things as surveillance cameras, reflective mirrors and locked entryways for their ATMs.
On this day in 1864, Union Army General William Tecumseh Sherman lays siege to Atlanta, Georgia, a critical Confederate hub, shelling civilians and cutting off supply lines. The Confederates retreated, destroying the city’s munitions as they went. On November 15 of that year, Sherman’s troops burned much of the city before continuing their march through the South. Sherman’s Atlanta campaign was one of the most decisive victories of the Civil War.
William Sherman, born May 8, 1820, in Lancaster, Ohio, attended West Point and served in the army before becoming a banker and then president of a military school in Louisiana. When the Civil War broke out in 1861 after 11 Southern slave states seceded from the Union, Sherman joined the Union Army and eventually commanded large numbers of troops, under General Ulysses S. Grant, at the battles of Shiloh (1862), Vicksburg (1863) and Chattanooga (1863). In the spring of 1864, Sherman became supreme commander of the armies in the West and was ordered by Grant to take the city of Atlanta, then a key military supply center and railroad hub for the Confederates.
Sherman’s Atlanta campaign began on May 4, 1864, and in the first few months his troops engaged in several fierce battles with Confederate soldiers on the outskirts of the city, including the Battle of Kennesaw Mountain, which the Union forces lost. However, on September 1, Sherman’s men successfully captured Atlanta and continued to defend it through mid-November against Confederate forces led by John Hood. Before he set off on his famous March to the Sea on November 15, Sherman ordered that Atlanta’s military resources, including munitions factories, clothing mills and railway yards, be burned. The fire got out of control and left Atlanta in ruins.
Sherman and 60,000 of his soldiers then headed toward Savannah, Georgia, destroying everything in their path that could help the Confederates. They captured Savannah and completed their March to the Sea on December 23, 1864. The Civil War ended on April 9, 1865, when the Confederate commander in chief, Robert E. Lee, surrendered to Grant at Appomattox Court House, Virginia.
After the war, Sherman succeeded Grant as commander in chief of the U.S. Army, serving from 1869 to 1883. Sherman, who is credited with the phrase “war is hell,” died February 14, 1891, in New York City. The city of Atlanta swiftly recovered from the war and became the capital of Georgia in 1868, first on a temporary basis and then permanently by popular vote in 1877.
On this day in 1980, representatives of the communist government of Poland agree to the demands of striking shipyard workers in the city of Gdansk. Former electrician Lech Walesa led the striking workers, who went on to form Solidarity, the first independent labor union to develop in a Soviet bloc nation.
In July 1980, facing economic crisis, Poland’s government raised the price of food and other goods, while curbing the growth of wages. The price hikes made it difficult for many Poles to afford basic necessities, and a wave of strikes swept the country. Amid mounting tensions, a popular forklift operator named Anna Walentynowicz was fired from the Lenin Shipyard in the northern Polish city of Gdansk. In mid-August, some 17,000 of the shipyard’s workers began a sit-down strike to campaign for her reinstatement, as well as for a modest increase in wages. They were led by the former shipyard electrician Lech Walesa, who had himself been fired for union activism four years earlier.
Despite governmental censorship and attempts to keep news of the strike from getting out, similar protests broke out in industrial cities throughout Poland. On August 17, an Interfactory Strike Committee presented the Polish government with 21 ambitious demands, including the right to organize independent trade unions, the right to strike, the release of political prisoners and increased freedom of expression. Fearing the general strike would lead to a national revolt, the government sent a commission to Gdansk to negotiate with the rebellious workers. On August 31, Walesa and Deputy Premier Mieczyslaw Jagielski signed an agreement giving in to many of the workers’ demands. Walesa signed the document with a giant ballpoint pen decorated with a picture of the newly elected Pope John Paul II (Karol Wojtyla, the former archbishop of Krakow).
In the wake of the Gdansk strike, leaders of the Interfactory Strike Committee voted to create a single national trade union known as Solidarnosc (Solidarity), which soon evolved into a mass social movement, with a membership of more than 10 million people. Solidarity attracted sympathy from Western leaders and hostility from Moscow, where the Kremlin considered a military invasion of Poland. In late 1981, under Soviet pressure, the government of General Wojciech Jaruzelski annulled the recognition of Solidarity and declared martial law in Poland. Some 6,000 Solidarity activists were arrested, including Walesa, who was detained for almost a year. The Solidarity movement moved underground, where it continued to enjoy support from international leaders such as U.S. President Ronald Reagan, who imposed sanctions on Poland. Walesa was awarded the 1983 Nobel Peace Prize, and after the fall of communism in 1989 he became the first president of Poland ever to be elected by popular vote.
On this day in 1967, Thurgood Marshall becomes the first African American to be confirmed as a Supreme Court justice. He would remain on the Supreme Court for 24 years before retiring for health reasons, leaving a legacy of upholding the rights of the individual as guaranteed by the U.S. Constitution.
From a young age, Marshall seemed destined for a place in the American justice system. His parents instilled in him an appreciation for the Constitution, a feeling that was reinforced by his schoolteachers, who forced him to read the document as punishment for his misbehavior. After graduating from Lincoln University in 1930, Marshall sought admission to the University of Maryland School of Law, but was turned away because of the school’s segregation policy, which effectively forbade blacks from studying with whites. Instead, Marshall attended Howard University Law School, from which he graduated magna cum laude in 1933. (Marshall later successfully sued the Maryland School of Law for its unfair admissions policy.)
Setting up a private practice in his home state of Maryland, Marshall quickly established a reputation as a lawyer for the “little man.” Within a year, he began working with the Baltimore NAACP (National Association for the Advancement of Colored People), and went on to become the organization’s chief counsel by 1940, when he was 32. Over the next two decades, Marshall distinguished himself as one of the country’s leading advocates for individual rights, winning 29 of the 32 cases he argued before the Supreme Court, all of which challenged in some way the “separate but equal” doctrine established by the landmark case Plessy v. Ferguson (1896). The high-water mark of Marshall’s career as a litigator came in 1954 with his victory in Brown v. Board of Education of Topeka. In that case, Marshall argued that the “separate but equal” principle was unconstitutional, and designed to keep blacks “as near [slavery] as possible.”
In 1961, Marshall was appointed by then-President John F. Kennedy to the U.S. Court of Appeals for the Second Circuit, a position he held until 1965, when Kennedy’s successor, Lyndon B. Johnson, named him solicitor general. Following the retirement of Justice Tom Clark in 1967, President Johnson appointed Marshall to the Supreme Court, a decision confirmed by the Senate with a 69-11 vote. Over the next 24 years, Justice Marshall came out in favor of abortion rights and against the death penalty, as he continued his tireless commitment to ensuring equitable treatment of individuals, particularly minorities, by state and federal governments.
Hurricane Katrina makes landfall near New Orleans, Louisiana, as a Category 4 hurricane on this day in 2005. Despite being only the third most powerful storm of the 2005 hurricane season, Katrina was one of the worst natural disasters in the history of the United States. After briefly coming ashore in southern Florida on August 25 as a Category 1 hurricane, Katrina gained strength before slamming into the Gulf Coast on August 29. In addition to bringing devastation to the New Orleans area, the hurricane caused damage along the coasts of Mississippi and Alabama, as well as other parts of Louisiana.
New Orleans Mayor Ray Nagin ordered a mandatory evacuation of the city on August 28, when Katrina briefly achieved Category 5 status and the National Weather Service predicted “devastating” damage to the area. But an estimated 150,000 people, either unwilling to leave or lacking the resources to do so, remained in the city. The storm brought sustained winds of 145 miles per hour, which cut power lines, destroyed homes and even turned cars into projectiles. Katrina caused record storm surges all along the Mississippi Gulf Coast. The surges overwhelmed the levees that protected New Orleans, parts of which lie as much as six feet below sea level, from Lake Pontchartrain and the Mississippi River. Soon, 80 percent of the city was flooded, in places up to the rooftops of homes and small buildings.
Tens of thousands of people sought shelter in the New Orleans Convention Center and the Louisiana Superdome. The situation in both places quickly deteriorated, as food and water ran low and conditions became unsanitary. Frustration mounted as it took up to two days for a full-scale relief effort to begin. In the meantime, the stranded residents suffered from heat, hunger, and a lack of medical care. Reports of looting, rape, and even murder began to surface. As news networks broadcast scenes from the devastated city to the world, it became obvious that a vast majority of the victims were African-American and poor, leading to difficult questions among the public about the state of racial equality in the United States. The federal government and President George W. Bush were roundly criticized for what was perceived as their slow response to the disaster. The head of the Federal Emergency Management Agency (FEMA), Michael Brown, resigned amid the ensuing controversy.
Finally, on September 1, the tens of thousands of people staying in the damaged Superdome and Convention Center began to be moved to the Astrodome in Houston, Texas, and another mandatory evacuation order was issued for the city. The next day, military convoys arrived with supplies and the National Guard was brought in to bring a halt to lawlessness. Efforts began to collect and identify corpses. On September 6, eight days after the hurricane, the Army Corps of Engineers finally completed temporary repairs to the three major holes in New Orleans’ levee system and were able to begin pumping water out of the city.
In all, it is believed that the hurricane caused more than 1,300 deaths and up to $150 billion in damages to both private property and public infrastructure. It is estimated that only about $40 billion of that number will be covered by insurance. One million people were displaced by the disaster, a phenomenon unseen in the United States since the Great Depression. Four hundred thousand people lost their jobs as a result of the disaster. Offers of international aid poured in from around the world, even from poor countries like Bangladesh and Sri Lanka. Private donations from U.S. citizens alone approached $600 million.
The storm also set off 36 tornadoes in Mississippi, Alabama, Georgia, Pennsylvania, and Virginia, resulting in one death.
President Bush declared September 16 a national day of remembrance for the victims of Hurricane Katrina.
After four years of separation, Charles, Prince of Wales and heir to the British throne, and his wife, Princess Diana, formally divorce.
On July 29, 1981, nearly one billion television viewers in 74 countries tuned in to witness the marriage of Prince Charles, heir to the British throne, to Lady Diana Spencer, a young English schoolteacher. Married in a grand ceremony at St. Paul’s Cathedral in the presence of 2,650 guests, the couple’s romance was, for the moment, the envy of the world. Their first child, Prince William, was born in 1982, and their second, Prince Harry, in 1984.
Before long, however, the fairy tale couple grew apart, an experience that was particularly painful under the ubiquitous eyes of the world’s tabloid media. Diana and Charles announced a separation in 1992, though they continued to carry out their royal duties. In August 1996, two months after Queen Elizabeth II urged the couple to divorce, the prince and princess reached a final agreement. In exchange for a generous settlement, and the right to retain her apartments at Kensington Palace and her title of “Princess of Wales,” Diana agreed to relinquish the title of “Her Royal Highness” and any future claims to the British throne.
In the year following the divorce, the popular princess seemed well on her way to achieving her dream of becoming “a queen in people’s hearts,” but on August 31, 1997, she was killed with her companion Dodi Fayed in a car accident in Paris. An investigation conducted by the French police concluded that the driver, who also died in the crash, was heavily intoxicated and caused the accident while trying to escape the paparazzi photographers who consistently tailed Diana during any public outing.
Prince Charles married his longtime mistress, Camilla Parker Bowles, on April 9, 2005.
The most powerful volcanic eruption in recorded history occurs on Krakatau (also called Krakatoa), a small, uninhabited volcanic island located west of Sumatra in Indonesia, on this day in 1883. Heard 3,000 miles away, the explosions threw five cubic miles of earth 50 miles into the air, created 120-foot tsunamis and killed 36,000 people.
Krakatau exhibited its first stirrings in more than 200 years on May 20, 1883. A German warship passing by reported a seven-mile-high cloud of ash and dust over Krakatau. For the next two months, similar explosions would be witnessed by commercial liners and natives on nearby Java and Sumatra. With little to no idea of the impending catastrophe, the local inhabitants greeted the volcanic activity with festive excitement.
On August 26 and August 27, excitement turned to horror as Krakatau literally blew itself apart, setting off a chain of natural disasters that would be felt around the world for years to come. An enormous blast on the afternoon of August 26 destroyed the northern two-thirds of the island; as it plunged into the Sunda Strait, between the Java Sea and Indian Ocean, the gushing mountain generated a series of pyroclastic flows (fast-moving currents of hot gas, ash and rock) and monstrous tsunamis that swept over nearby coastlines. Four more eruptions beginning at 5:30 a.m. the following day proved cataclysmic. The explosions could be heard as far as 3,000 miles away, and ash was propelled to a height of 50 miles. Fine dust from the explosion drifted around the earth, causing spectacular sunsets and forming an atmospheric veil that lowered temperatures worldwide by several degrees.
Of the estimated 36,000 deaths resulting from the eruption, at least 31,000 were caused by the tsunamis created when much of the island fell into the water. The greatest of these waves measured 120 feet high, and washed over nearby islands, stripping away vegetation and carrying people out to sea. Another 4,500 people were scorched to death from the pyroclastic flows that rolled over the sea, stretching as far as 40 miles, according to some sources.
In addition to Krakatau, which is still active, Indonesia has another 130 active volcanoes, the most of any country in the world.
On this day in 1939, the first televised Major League baseball game is broadcast on station W2XBS, the station that was to become WNBC-TV. Announcer Red Barber called the game between the Cincinnati Reds and the Brooklyn Dodgers at Ebbets Field in Brooklyn, New York.
At the time, television was still in its infancy. Regular programming did not yet exist, and very few people owned television sets–there were only about 400 in the New York area. Not until 1946 did regular network broadcasting catch on in the United States, and only in the mid-1950s did television sets become more common in the American household.
In 1939, the World’s Fair–which was being held in New York–became the catalyst for the historic broadcast. The television was one of the fair’s prize exhibits, and organizers believed that the Dodgers-Reds doubleheader on August 26 was the perfect event to showcase America’s grasp on the new technology.
By today’s standards, the video coverage was somewhat crude. There were only two stationary camera angles: The first was placed down the third base line to pick up infield throws to first, and the second was placed high above home plate to get an extensive view of the field. It was also difficult to capture fast-moving plays: Swinging bats looked like paper fans, and the ball was all but invisible during pitches and hits.
Nevertheless, the experiment was a success, driving interest in the development of television technology, particularly for sporting events. Though baseball owners were initially concerned that televising baseball would sap actual attendance, they soon warmed to the idea, and to the possibilities for revenue generation that came with increased exposure of the game, including the sale of rights to air certain teams or games and television advertising.
Today, televised sports is a multibillion-dollar industry, with technology that gives viewers an astounding amount of visual and audio detail. Cameras are now so precise that they can capture the way a ball changes shape when struck by a bat, and athletes are wired to pick up field-level and sideline conversation.
On this day in 1835, the first in a series of six articles announcing the supposed discovery of life on the moon appears in the New York Sun newspaper.
Known collectively as “The Great Moon Hoax,” the articles were supposedly reprinted from the Edinburgh Journal of Science. The byline was Dr. Andrew Grant, described as a colleague of Sir John Herschel, a famous astronomer of the day. Herschel had in fact traveled to Cape Town, South Africa, in January 1834 to set up an observatory with a powerful new telescope. As Grant described it, Herschel had found evidence of life forms on the moon, including such fantastic animals as unicorns, two-legged beavers and furry, winged humanoids resembling bats. The articles also offered vivid descriptions of the moon’s geography, complete with massive craters, enormous amethyst crystals, rushing rivers and lush vegetation.
The New York Sun, founded in 1833, was one of the new “penny press” papers that appealed to a wider audience with a cheaper price and a more narrative style of journalism. From the day the first moon hoax article was released, sales of the paper shot up considerably. It was exciting stuff, and readers lapped it up. The only problem was that none of it was true. The Edinburgh Journal of Science had stopped publication years earlier, and Grant was a fictional character. The articles were most likely written by Richard Adams Locke, a Sun reporter educated at Cambridge University. Intended as satire, they were designed to poke fun at earlier, serious speculations about extraterrestrial life, particularly those of Reverend Thomas Dick, a popular science writer who claimed in his bestselling books that the moon alone had 4.2 billion inhabitants.
Readers were completely taken in by the story, however, and failed to recognize it as satire. The craze over Herschel’s supposed discoveries even fooled a committee of Yale University scientists, who traveled to New York in search of the Edinburgh Journal articles. After Sun employees sent them back and forth between the printing and editorial offices, hoping to discourage them, the scientists returned to New Haven without realizing they had been tricked.
On September 16, 1835, the Sun admitted the articles had been a hoax. People were generally amused by the whole thing, and sales of the paper didn’t suffer. The Sun continued operation until 1950, when it merged with the New York World-Telegram. The merger folded in 1967. A new New York Sun newspaper was founded in 2002, but it has no relation to the original.
After centuries of dormancy, Mount Vesuvius erupts in southern Italy, devastating the prosperous Roman cities of Pompeii and Herculaneum and killing thousands. The cities, buried under a thick layer of volcanic material and mud, were never rebuilt and largely forgotten in the course of history. In the 18th century, Pompeii and Herculaneum were rediscovered and excavated, providing an unprecedented archaeological record of the everyday life of an ancient civilization, startlingly preserved in sudden death.
The ancient cities of Pompeii and Herculaneum thrived near the base of Mount Vesuvius at the Bay of Naples. In the time of the early Roman Empire, 20,000 people lived in Pompeii, including merchants, manufacturers, and farmers who exploited the rich soil of the region with numerous vineyards and orchards. None suspected that the black fertile earth was the legacy of earlier eruptions of Mount Vesuvius. Herculaneum was a city of 5,000 and a favorite summer destination for rich Romans. Named for the mythic hero Hercules, Herculaneum housed opulent villas and grand Roman baths. Gambling artifacts found in Herculaneum and a brothel unearthed in Pompeii attest to the decadent nature of the cities. There were smaller resort communities in the area as well, such as the quiet little town of Stabiae.
At noon on August 24, 79 A.D., this pleasure and prosperity came to an end when the peak of Mount Vesuvius exploded, propelling a 10-mile mushroom cloud of ash and pumice into the stratosphere. For the next 12 hours, volcanic ash and a hail of pumice stones up to 3 inches in diameter showered Pompeii, forcing the city’s occupants to flee in terror. Some 2,000 people stayed in Pompeii, holed up in cellars or stone structures, hoping to wait out the eruption.
A westerly wind protected Herculaneum from the initial stage of the eruption, but then a giant cloud of hot ash and gas surged down the western flank of Vesuvius, engulfing the city and burning or asphyxiating all who remained. This lethal cloud was followed by a flood of volcanic mud and rock, burying the city.
Those who had stayed behind in Pompeii were killed on the morning of August 25, when a cloud of toxic gas poured into the city and suffocated everyone in its path. A flow of rock and ash followed, collapsing roofs and walls and burying the dead.
Much of what we know about the eruption comes from an account by Pliny the Younger, who was staying west along the Bay of Naples when Vesuvius exploded. In two letters to the historian Tacitus, he told of how “people covered their heads with pillows, the only defense against a shower of stones,” and of how “a dark and horrible cloud charged with combustible matter suddenly broke and set forth. Some bewailed their own fate. Others prayed to die.” Pliny, only 17 at the time, escaped the catastrophe and later became a noted Roman writer and administrator. His uncle, Pliny the Elder, was less lucky. Pliny the Elder, a celebrated naturalist, at the time of the eruption was the commander of the Roman fleet in the Bay of Naples. After Vesuvius exploded, he took his boats across the bay to Stabiae, to investigate the eruption and reassure terrified citizens. After going ashore, he was overcome by toxic gas and died.
According to Pliny the Younger’s account, the eruption lasted 18 hours. Pompeii was buried under 14 to 17 feet of ash and pumice, and the nearby seacoast was drastically changed. Herculaneum was buried under more than 60 feet of mud and volcanic material. Some residents of Pompeii later returned to dig out their destroyed homes and salvage their valuables, but many treasures were left and then forgotten.
In the 18th century, a well digger unearthed a marble statue on the site of Herculaneum. The local government excavated some other valuable art objects, but the project was abandoned. In 1748, a farmer found traces of Pompeii beneath his vineyard. Since then, excavations have gone on nearly without interruption until the present. In 1927, the Italian government resumed the excavation of Herculaneum, retrieving numerous art treasures, including bronze and marble statues and paintings.
The remains of 2,000 men, women, and children were found at Pompeii. After they perished from asphyxiation, their bodies were covered with ash that hardened and preserved their outlines. When the bodies later decomposed to skeletal remains, hollow cavities were left behind in the hardened ash. Archaeologists who found these cavities filled them with plaster, revealing in grim detail the death poses of the victims of Vesuvius. The rest of the city is likewise frozen in time, and ordinary objects that tell the story of everyday life in Pompeii are as valuable to archaeologists as the great unearthed statues and frescoes. It was not until 1982 that the first human remains were found at Herculaneum, and these hundreds of skeletons bear ghastly burn marks that testify to horrifying deaths.
Today, Mount Vesuvius is the only active volcano on the European mainland. Its last eruption was in 1944 and its last major eruption was in 1631. Another eruption is expected in the near future, which could be devastating for the 700,000 people who live in the “death zones” around Vesuvius.
On this day in 1902, pioneering cookbook author Fannie Farmer, who changed the way Americans prepare food by advocating the use of standardized measurements in recipes, opens Miss Farmer’s School of Cookery in Boston. In addition to teaching women about cooking, Farmer later educated medical professionals about the importance of proper nutrition for the sick.
Farmer was born March 23, 1857, and raised near Boston, Massachusetts. Her family believed in education for women and Farmer attended Medford High School; however, as a teenager she suffered a paralytic stroke that left her homebound for several years. As a result, she was unable to complete high school or attend college, and the illness left her with a permanent limp. When she was in her early 30s, Farmer attended the Boston Cooking School. Founded in 1879, the school promoted a scientific approach to food preparation and trained women to become cooking teachers at a time when their employment opportunities were limited. Farmer graduated from the program in 1889 and in 1891 became the school’s principal. In 1896, she published her first cookbook, The Boston Cooking School Cookbook, which included a wide range of straightforward recipes along with information on cooking and sanitation techniques, household management and nutrition. Farmer’s book became a bestseller and revolutionized American cooking through its use of precise measurements, a novel culinary concept at the time.
In 1902, Farmer left the Boston Cooking School and founded Miss Farmer’s School of Cookery. In addition to running her school, she traveled to speaking engagements around the U.S. and continued to write cookbooks. In 1904, she published Food and Cookery for the Sick and Convalescent, which provided food recommendations for specific diseases, nutritional information for children and information about the digestive system, among other topics. Farmer’s expertise in the areas of nutrition and illness led her to lecture at Harvard Medical School.
Farmer died January 15, 1915, at age 57. After Farmer’s death, Alice Bradley, who taught at Miss Farmer’s School of Cookery, took over the business and ran it until the mid-1940s. The Fannie Farmer Cookbook is still in print today.
On this day in 1950, officials of the United States Lawn Tennis Association (USLTA) accept Althea Gibson into their annual championship at Forest Hills, New York, making her the first African-American player to compete in a U.S. national tennis competition.
Growing up in Harlem, the young Gibson was a natural athlete. She started playing tennis at the age of 14 and the very next year won her first tournament, the New York State girls’ championship, sponsored by the American Tennis Association (ATA), which was organized in 1916 by black players as an alternative to the exclusively white USLTA. After prominent doctors and tennis enthusiasts Hubert Eaton and R. Walter Johnson took Gibson under their wing, she won her first of what would be 10 straight ATA championships in 1947.
In 1949, Gibson attempted to gain entry into the USLTA’s National Grass Court Championships at Forest Hills, the precursor of the U.S. Open. When the USLTA failed to invite her to any qualifying tournaments, Alice Marble–a four-time winner at Forest Hills–wrote a letter on Gibson’s behalf to the editor of American Lawn Tennis magazine. Marble criticized the “bigotry” of her fellow USLTA members, suggesting that if Gibson posed a challenge to current tour players, “it’s only fair that they meet this challenge on the courts.” Gibson was subsequently invited to participate in a New Jersey qualifying event, where she earned a berth at Forest Hills.
On August 28, 1950, Gibson beat Barbara Knapp 6-2, 6-2 in her first USLTA tournament match. She lost a tight match in the second round to Louise Brough, three-time defending Wimbledon champion. Gibson struggled over her first several years on tour but finally won her first major victory in 1956, at the French Open in Paris. She came into her own the following year, winning Wimbledon and the U.S. Open at the relatively advanced age of 30.
Gibson repeated at Wimbledon and the U.S. Open the next year but soon decided to retire from the amateur ranks and go pro. At the time, the pro tennis league was poorly developed, and Gibson at one point went on tour with the Harlem Globetrotters, playing tennis during halftime of their basketball games. In the early 1960s, Gibson became the first black player to compete on the women’s golf tour, though she never won a tournament. She was elected to the International Tennis Hall of Fame in 1971.
Though she once brushed off comparisons to Jackie Robinson, the trailblazing black baseball player, Gibson has been credited with paving the way for African-American tennis champions such as Arthur Ashe and, more recently, Venus and Serena Williams. After a long illness, she died in 2003 at the age of 76.
On this day in 1959, the modern United States receives its crowning star when President Dwight D. Eisenhower signs a proclamation admitting Hawaii into the Union as the 50th state. The president also issued an order for an American flag featuring 50 stars arranged in staggered rows: five six-star rows and four five-star rows. The new flag became official July 4, 1960.
The first known settlers of the Hawaiian Islands were Polynesian voyagers who arrived sometime in the eighth century. In the early 19th century, American traders came to Hawaii to exploit the islands’ sandalwood, which was much valued in China at the time. In the 1830s, the sugar industry was introduced to Hawaii and by the mid-19th century had become well established. American missionaries and planters brought about great changes in Hawaiian political, cultural, economic, and religious life. In 1840, a constitutional monarchy was established, stripping the Hawaiian monarch of much of his authority.
In 1893, a group of American expatriates and sugar planters supported by a division of U.S. Marines deposed Queen Liliuokalani, the last reigning monarch of Hawaii. One year later, the Republic of Hawaii was established as a U.S. protectorate with Hawaiian-born Sanford B. Dole as president. Many in Congress opposed the formal annexation of Hawaii, and it was not until 1898, following the use of the naval base at Pearl Harbor during the Spanish-American War, that Hawaii’s strategic importance became evident and formal annexation was approved. Two years later, Hawaii was organized into a formal U.S. territory. During World War II, Hawaii became firmly ensconced in the American national identity following the surprise Japanese attack on Pearl Harbor in December 1941.
In March 1959, the U.S. government approved statehood for Hawaii, and in June the Hawaiian people voted by a wide majority to accept admittance into the United States. Two months later, Hawaii officially became the 50th state.
On this day in 1911, a dispatcher in the New York Times office sends the first telegram around the world via commercial service. Exactly 66 years later, the National Aeronautics and Space Administration (NASA) sends a different kind of message–a phonograph record containing information about Earth for extraterrestrial beings–shooting into space aboard the unmanned spacecraft Voyager II.
The Times decided to send its 1911 telegram in order to determine how fast a commercial message could be sent around the world by telegraph cable. The message, reading simply “This message sent around the world,” left the dispatch room on the 17th floor of the Times building in New York at 7 p.m. on August 20. After it traveled more than 28,000 miles, being relayed by 16 different operators, through San Francisco, the Philippines, Hong Kong, Saigon, Singapore, Bombay, Malta, Lisbon and the Azores–among other locations–the reply was received by the same operator 16.5 minutes later. It was the fastest time achieved by a commercial cablegram since the opening of the Pacific cable in 1900 by the Commercial Cable Company.
On August 20, 1977, a NASA rocket launched Voyager II, an unmanned 1,820-pound spacecraft, from Cape Canaveral, Florida. It was the first of two such craft to be launched that year on a “Grand Tour” of the outer planets, organized to coincide with a rare alignment of Jupiter, Saturn, Uranus and Neptune. Aboard Voyager II was a 12-inch copper phonograph record called “Sounds of Earth.” Intended as a kind of introductory time capsule, the record included greetings in 60 languages and scientific information about Earth and the human race, along with classical, jazz and rock ‘n’ roll music, nature sounds like thunder and surf, and recorded messages from President Jimmy Carter and other world leaders.
The brainchild of astronomer Carl Sagan, the record was sent with Voyager II and its twin craft, Voyager I–launched just two weeks later–in the faint hope that it might one day be discovered by extraterrestrial creatures. The record was sealed in an aluminum jacket that would keep it intact for 1 billion years, along with instructions on how to play the record, with a cartridge and needle provided.
More importantly, the two Voyager craft were designed to explore the outer solar system and send information and photographs of the distant planets to Earth. Over the next 12 years, the mission proved a smashing success. After both craft flew by Jupiter and Saturn, Voyager I went flying off towards the solar system’s edge while Voyager II visited Uranus in 1986 and Neptune in 1989 before sailing off to join its twin in the outer solar system.
Thanks to the Voyager program, NASA scientists gained a wealth of information about the outer planets, including close-up photographs of Saturn’s seven rings; evidence of active geysers and volcanoes exploding on some of the four planets’ moons; winds of more than 1,500 mph on Neptune; and measurements of the magnetic fields on Uranus and Neptune. The two craft are expected to continue sending data until 2020, or until their plutonium-based power sources run out. After that, they will continue to sail on through the galaxy for millions of years to come, barring some unexpected collision.