This Day in History
The 18th Amendment to the U.S. Constitution, prohibiting the “manufacture, sale, or transportation of intoxicating liquors for beverage purposes,” is ratified on this day in 1919 and becomes the law of the land.
The movement for the prohibition of alcohol began in the early 19th century, when Americans concerned about the adverse effects of drinking began forming temperance societies. By the late 19th century, these groups had become a powerful political force, campaigning on the state level and calling for total national abstinence. In December 1917, the 18th Amendment, also known as the Prohibition Amendment, was passed by Congress and sent to the states for ratification.
The amendment was ratified in January 1919, but prohibition itself did not take effect until January 1920, one year later. In the meantime, in October 1919, Congress passed the Volstead Act, or National Prohibition Act, over President Woodrow Wilson’s veto. The Volstead Act provided for the enforcement of prohibition, including the creation of a special unit of the Treasury Department. Despite a vigorous effort by law-enforcement agencies, the Volstead Act failed to prevent the large-scale distribution of alcoholic beverages, and organized crime flourished in America. In 1933, the 21st Amendment to the Constitution was passed and ratified, repealing prohibition.
On this day in 1967, at the Los Angeles Coliseum, the Green Bay Packers beat the Kansas City Chiefs in the first-ever world championship game of American football.
In the mid-1960s, the intense competition for players and fans between the National Football League (NFL) and the upstart American Football League (AFL) led to talks of a possible merger. It was decided that the winners of each league’s championship would meet each year in a single game to determine the “world champion of football.”
In that historic first game–played before a non-sell-out crowd of 61,946 people–Green Bay scored three touchdowns in the second half to defeat Kansas City 35-10. Led by MVP quarterback Bart Starr, the Packers benefited from Max McGee’s stellar receiving and a key interception by safety Willie Wood. For their win, each member of the Packers collected $15,000: the largest single-game share in the history of team sports.
Postseason college games were known as “bowl” games, and AFL founder Lamar Hunt suggested that the new pro championship be called the “Super Bowl.” The term was officially introduced in 1969, along with roman numerals to designate the individual games. In 1970, the NFL and AFL merged into one league with two conferences, each with 13 teams. Since then, the Super Bowl has been a face-off between the winners of the American Football Conference (AFC) and the National Football Conference (NFC) for the NFL championship and the coveted Vince Lombardi Trophy, named for the legendary Packers coach who guided his team to victory in the first two Super Bowls.
Super Bowl Sunday has become an unofficial American holiday, complete with parties, betting pools and excessive consumption of food and drink. On average, 80 to 90 million people are tuned into the game on TV at any given moment, while some 130-140 million watch at least some part of the game. The commercials shown during the game have become an attraction in themselves, with TV networks charging as much as $2.5 million for a 30-second spot and companies making more expensive, high-concept ads each year. The game itself has more than once been upstaged by its elaborate pre-game or halftime entertainment, most recently in 2004, when Janet Jackson’s infamous “wardrobe malfunction” resulted in a $550,000 fine for CBS, the network airing the game, and tighter controls on televised indecency.
The theologian, musician, philosopher and Nobel Prize-winning physician Albert Schweitzer is born on this day in 1875 in Upper Alsace, Germany (now Haut-Rhin, France).
The son and grandson of ministers, Schweitzer studied theology and philosophy at the universities of Strasbourg, Paris and Berlin. After working as a pastor, he entered medical school in 1905 with the dream of becoming a missionary in Africa. Schweitzer was also an acclaimed concert organist who played professional engagements to earn money for his education. By the time he received his M.D. in 1913, the overachieving Schweitzer had published several books, including the influential The Quest for the Historical Jesus and a book on the composer Johann Sebastian Bach.
Medical degree in hand, Schweitzer and his wife, Helene Bresslau, moved to French Equatorial Africa where he founded a hospital at Lambarene (modern-day Gabon). When World War I broke out, the German-born Schweitzers were sent to a French internment camp as prisoners of war. Released in 1918, they returned to Lambarene in 1924. Over the next three decades, Schweitzer made frequent visits to Europe to lecture on culture and ethics. His philosophy revolved around the concept of what he called “reverence for life”–the idea that all life must be respected and loved, and that humans should enter into a personal, spiritual relationship with the universe and all its creations. This reverence for life, according to Schweitzer, would naturally lead humans to live a life of service to others.
Schweitzer won widespread praise for putting his uplifting theory into practice at his hospital in Africa, where he treated many patients with leprosy and the dreaded African sleeping sickness. Awarded the Nobel Peace Prize for 1952, Schweitzer used his $33,000 award to start a leprosarium at Lambarene. From the early 1950s until his death in 1965, Schweitzer spoke and wrote tirelessly about his opposition to nuclear tests and nuclear weapons, adding his voice to those of fellow Nobelists Albert Einstein and Bertrand Russell.
On this day in 1128, Pope Honorius II grants a papal sanction to the military order known as the Knights Templar, declaring it to be an army of God.
Led by the Frenchman Hugues de Payens, the Knights Templar organization was founded in 1118. Its self-imposed mission was to protect Christian pilgrims on their way to the Holy Land during the Crusades, the series of military expeditions aimed at defeating Muslims in Palestine. The Templars took their name from the location of their headquarters, at Jerusalem’s Temple Mount. For a while, the Templars had only nine members, mostly due to their rigid rules. In addition to having noble birth, the knights were required to take strict vows of poverty, obedience and chastity. In 1127, new promotional efforts convinced many more noblemen to join the order, gradually increasing its size and influence.
While the individual knights were not allowed to own property, there was no such restriction on the organization as a whole, and over the years many rich Christians gave gifts of land and other valuables to support the Knights Templar. By the time the Crusades ended unsuccessfully in the early 14th century, the order had grown extremely wealthy, provoking the jealousy of both religious and secular powers. In 1307, King Philip IV of France and Pope Clement V combined to take down the Knights Templar, arresting the grand master, Jacques de Molay, on charges of heresy, sacrilege and Satanism. Under torture, de Molay and other leading Templars confessed and were eventually burned at the stake. Clement dissolved the Templars in 1312, assigning their property and monetary assets to a rival order, the Knights Hospitallers. In fact, though, Philip and his English counterpart, King Edward II, claimed most of the wealth after banning the organization from their respective countries.
The modern-day Catholic Church has admitted that the persecution of the Knights Templar was unjustified and claimed that Pope Clement was pressured by secular rulers to dissolve the order. Over the centuries, myths and legends about the Templars have grown, including the belief that they may have discovered holy relics at Temple Mount, including the Holy Grail, the Ark of the Covenant or parts of the cross from Christ’s crucifixion. The imagined secrets of the Templars have inspired various books and movies, including the blockbuster novel and film The Da Vinci Code.
On this day in 1926, the two-man comedy series “Sam ‘n’ Henry” debuts on Chicago’s WGN radio station. Two years later, after changing its name to “Amos ‘n’ Andy,” the show became one of the most popular radio programs in American history.
Though the creators and the stars of the new radio program, Freeman Gosden and Charles Correll, were both white, the characters they played were two black men from the Deep South who moved to Chicago to seek their fortunes. By that time, white actors performing in dark stage makeup–or “blackface”–had been a significant tradition in American theater for over 100 years. Gosden and Correll, both vaudeville performers, were doing a Chicago comedy act in blackface when an employee at the Chicago Tribune suggested they create a radio show.
When “Sam ‘n’ Henry” debuted in January 1926, it became an immediate hit. In 1928, Gosden and Correll took their act to a rival station, the Chicago Daily News’ WMAQ. When they discovered WGN owned the rights to their characters’ names, they simply changed them. As their new contract gave Gosden and Correll the right to syndicate the program, the popularity of “Amos ‘n’ Andy” soon exploded. Over the next 22 years, the show would become the highest-rated comedy in radio history, attracting more than 40 million listeners.
By 1951, when “Amos ‘n’ Andy” came to television, changing attitudes about race and concerns about racism had virtually wiped out the practice of blackface. With Alvin Childress and Spencer Williams taking over for Gosden and Correll, the show was the first TV series to feature an all-black cast and the only one of its kind for the next 20 years. This did not stop African-American advocacy groups and eventually the National Association for the Advancement of Colored People (NAACP) from criticizing both the radio and TV versions of “Amos ‘n’ Andy” for promoting racial stereotypes. These protests led to the TV show’s cancellation in 1953.
The final radio broadcast of “Amos ‘n’ Andy” aired on November 25, 1960. The following year, Gosden and Correll created a short-lived TV sequel called “Calvin and the Colonel.” This time, they avoided controversy by replacing the human characters with an animated fox and bear. The show was canceled after one season.
On January 11, 1908, U.S. President Theodore Roosevelt declares the massive Grand Canyon in northwestern Arizona a national monument.
Though Native Americans lived in the area as early as the 13th century, the first European sighting of the canyon wasn’t until 1540, by members of an expedition headed by the Spanish explorer Francisco Vasquez de Coronado. Because of its remote and inaccessible location, several centuries passed before North American settlers really explored the canyon. In 1869, geologist John Wesley Powell led a group of 10 men in the first difficult journey down the rapids of the Colorado River and along the length of the 277-mile gorge in four rowboats.
By the end of the 19th century, the Grand Canyon was attracting thousands of tourists each year. One famous visitor was President Theodore Roosevelt, a New Yorker with a particular affection for the American West. After becoming president in 1901 following the assassination of President William McKinley, Roosevelt made environmental conservation a major part of his presidency. After establishing the first national wildlife refuge to protect the country’s animals, fish and birds, Roosevelt turned his attention to federal regulation of public lands. Though a region could be given national park status–indicating that all private development on that land was illegal–only by an act of Congress, Roosevelt cut down on red tape by beginning a new presidential practice of granting a similar “national monument” designation to some of the West’s greatest treasures.
In January 1908, Roosevelt exercised this right to make more than 800,000 acres of the Grand Canyon area into a national monument. “Let this great wonder of nature remain as it now is,” he declared. “You cannot improve on it. But what you can do is keep it for your children, your children’s children, and all who come after you, as the one great sight which every American should see.”
Congress did not officially outlaw private development in the Grand Canyon until 1919, when President Woodrow Wilson signed the Grand Canyon National Park Act. Today, more than 5 million people visit the canyon each year. The canyon floor is accessible by foot, mule or boat, and whitewater rafting, hiking and running in the area are especially popular. Many choose to conserve their energies and simply take in the breathtaking view from the canyon’s South Rim–some 7,000 feet above sea level–and marvel at a vista virtually unchanged for over 400 years.
On this day in 1901, a drilling derrick at Spindletop Hill near Beaumont, Texas, produces an enormous gusher of crude oil, coating the landscape for hundreds of feet and signaling the advent of the American oil industry. The geyser was discovered at a depth of over 1,000 feet, flowed at an initial rate of approximately 100,000 barrels a day and took nine days to cap. Following the discovery, petroleum, which until that time had been used in the U.S. primarily as a lubricant and in kerosene for lamps, would become the main fuel source for new inventions such as cars and airplanes; coal-powered forms of transportation including ships and trains would also convert to the liquid fuel.
Crude oil, which became the world’s first trillion-dollar industry, is a natural mix of hundreds of different hydrocarbon compounds trapped in underground rock. The hydrocarbons were formed millions of years ago when tiny aquatic plants and animals died and settled on the bottoms of ancient waterways, creating a thick layer of organic material. Sediment later covered this material, putting heat and pressure on it and transforming it into the petroleum that comes out of the ground today.
In the early 1890s, Texas businessman and amateur geologist Patillo Higgins became convinced there was a large pool of oil under a salt-dome formation south of Beaumont. He and several partners established the Gladys City Oil, Gas and Manufacturing Company and made several unsuccessful drilling attempts before Higgins left the company. In 1899, Higgins leased a tract of land at Spindletop to mining engineer Anthony Lucas. The Lucas gusher blew on January 10, 1901, and ushered in the liquid fuel age. Unfortunately for Higgins, he’d lost his ownership stake by that point.
Beaumont became a “black gold” boomtown, its population tripling in three months. The town filled up with oil workers, investors, merchants and con men (leading some people to dub it “Swindletop”). Within a year, there were more than 285 active wells at Spindletop and an estimated 500 oil and land companies operating in the area, including some that are major players today: Humble (now Exxon), the Texas Company (Texaco) and Magnolia Petroleum Company (Mobil).
Spindletop experienced a second boom starting in the mid-1920s when more oil was discovered at deeper depths. In the 1950s, Spindletop was mined for sulphur. Today, only a few oil wells still operate in the area.
On this day in 1493, Italian explorer Christopher Columbus, sailing near the Dominican Republic, sees three “mermaids”–in reality manatees–and describes them as “not half as beautiful as they are painted.” Six months earlier, Columbus (1451-1506) had set off from Spain across the Atlantic Ocean with the Nina, Pinta and Santa Maria, hoping to find a western trade route to Asia. Instead, his voyage, the first of four he would make, led him to the Americas, or “New World.”
Mermaids, mythical half-female, half-fish creatures, have existed in seafaring cultures at least since the time of the ancient Greeks. Typically depicted as having a woman’s head and torso, a fishtail instead of legs and holding a mirror and comb, mermaids live in the ocean and, according to some legends, can take on a human shape and marry mortal men. Mermaids are closely linked to sirens, another folkloric figure, part-woman, part-bird, who live on islands and sing seductive songs to lure sailors to their deaths.
Mermaid sightings by sailors, when they weren’t made up, were most likely manatees, dugongs or Steller’s sea cows (which became extinct by the 1760s due to over-hunting). Manatees are slow-moving aquatic mammals with human-like eyes, bulbous faces and paddle-like tails. It is likely that manatees evolved from an ancestor they share with the elephant. The three species of manatee (West Indian, West African and Amazonian) and one species of dugong belong to the Sirenia order. As adults, they’re typically 10 to 12 feet long and weigh 800 to 1,200 pounds. They’re plant-eaters, have a slow metabolism and can only survive in warm water.
Manatees live an average of 50 to 60 years in the wild and have no natural predators. However, they are an endangered species. In the U.S., the majority of manatees are found in Florida, where scores of them die or are injured each year due to collisions with boats.
On this day in 1877, Crazy Horse and his warriors–outnumbered, low on ammunition and forced to use outdated weapons to defend themselves–fight their final losing battle against the U.S. Cavalry in Montana.
Six months earlier, in the Battle of Little Bighorn, Crazy Horse and his ally, Chief Sitting Bull, led their combined forces of Sioux and Cheyenne to a stunning victory over Lieutenant Colonel George Custer (1839-76) and his men. The Indians were resisting the U.S. government’s efforts to force them back to their reservations. After Custer and over 200 of his soldiers were killed in the conflict, later dubbed “Custer’s Last Stand,” the American public wanted revenge. As a result, the U.S. Army launched a winter campaign in 1876-77, led by General Nelson Miles (1839-1925), against the remaining hostile Indians on the Northern Plains.
Combining military force with diplomatic overtures, Miles convinced many Indians to surrender and return to their reservations. Much to Miles’ frustration, though, Sitting Bull refused to give in and fled across the border to Canada, where he and his people remained for four years before finally returning to the U.S. to surrender in 1881. Sitting Bull died in 1890. Meanwhile, Crazy Horse and his band also refused to surrender, even though they were suffering from illness and starvation.
On January 8, 1877, General Miles found Crazy Horse’s camp along Montana’s Tongue River. U.S. soldiers opened fire with their big wagon-mounted guns, driving the Indians from their warm tents out into a raging blizzard. Crazy Horse and his warriors managed to regroup on a ridge and return fire, but most of their ammunition was gone, and they were reduced to fighting with bows and arrows. They managed to hold off the soldiers long enough for the women and children to escape under cover of the blinding blizzard before they turned to follow them.
Though he had escaped decisive defeat, Crazy Horse realized that Miles and his well-equipped cavalry troops would eventually hunt down and destroy his cold, hungry followers. On May 6, 1877, Crazy Horse led approximately 1,100 Indians to the Red Cloud reservation near Nebraska’s Fort Robinson and surrendered. Five months later, a guard fatally stabbed him after he allegedly resisted imprisonment by Indian policemen.
In 1948, American sculptor Korczak Ziolkowski began work on the Crazy Horse Memorial, a massive monument carved into a mountain in South Dakota. Still a work in progress, the monument will stand 563 feet high and 641 feet long when completed.
On this day in 1789, America’s first presidential election is held. Voters cast ballots to choose state electors; only white men who owned property were allowed to vote. As expected, George Washington won the election and was sworn into office on April 30, 1789.
As it did in 1789, the United States still uses the Electoral College system, established by the U.S. Constitution, which today gives all American citizens over the age of 18 the right to vote for electors, who in turn vote for the president. The president and vice president are the only elected federal officials chosen by the Electoral College instead of by direct popular vote.
Today political parties usually nominate their slate of electors at their state conventions or by a vote of the party’s central state committee, with party loyalists often being picked for the job. Members of the U.S. Congress, though, can’t be electors. Each state is allowed to choose as many electors as it has senators and representatives in Congress; the District of Columbia has three electors. During a presidential election year, on Election Day (the first Tuesday after the first Monday in November), the electors from the party that gets the most popular votes are elected in a winner-take-all system, with the exception of Maine and Nebraska, which award some of their electors by congressional district. In order to win the presidency, a candidate needs a majority of 270 electoral votes out of a possible 538.
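The elector arithmetic described above can be sketched in a few lines of Python. The seat totals (435 representatives, 100 senators) and the constitutional source of D.C.’s three electors (the 23rd Amendment) are standard figures assumed here, not quoted from this article:

```python
# Sketch of the Electoral College arithmetic: each state gets electors equal
# to its senators (always 2) plus its House representatives; D.C. gets 3
# electors under the 23rd Amendment.
house_seats = 435    # total U.S. representatives
senate_seats = 100   # two per state
dc_electors = 3

total_electors = house_seats + senate_seats + dc_electors
majority_needed = total_electors // 2 + 1  # smallest strict majority

print(total_electors)   # 538
print(majority_needed)  # 270

def state_electors(representatives):
    """Electors for a single state: 2 senators + its House delegation."""
    return 2 + representatives

# Example: a state with 8 representatives casts 10 electoral votes.
print(state_electors(8))  # 10
```

This is why the article’s 270-of-538 threshold holds: 538 is the sum of all congressional seats plus D.C.’s three electors, and 270 is the smallest whole number greater than half of 538.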
On the first Monday after the second Wednesday in December of a presidential election year, each state’s electors meet, usually in their state capital, and simultaneously cast their ballots nationwide. This is largely ceremonial: Because electors nearly always vote with their party, presidential elections are essentially decided on Election Day. Although electors aren’t constitutionally mandated to vote for the winner of the popular vote in their state, it is demanded by tradition and required by law in 26 states and the District of Columbia (in some states, violating this rule is punishable by a $1,000 fine). Historically, over 99 percent of all electors have cast their ballots in line with the voters. On January 6, as a formality, the electoral votes are counted before Congress, and on January 20 the commander in chief is sworn into office.
Critics of the Electoral College argue that the winner-take-all system makes it possible for a candidate to be elected president even if he gets fewer popular votes than his opponent. This happened in the elections of 1876, 1888 and 2000. However, supporters contend that if the Electoral College were done away with, heavily populated states such as California and Texas might decide every election and issues important to voters in smaller states would be ignored.
On this day in 1838, Samuel Morse’s telegraph system is demonstrated for the first time at the Speedwell Iron Works in Morristown, New Jersey. The telegraph, a device which used electric impulses to transmit encoded messages over a wire, would eventually revolutionize long-distance communication, reaching the height of its popularity in the 1920s and 1930s.
Samuel Finley Breese Morse was born April 27, 1791, in Charlestown, Massachusetts. He attended Yale University, where he was interested in art, as well as electricity, still in its infancy at the time. After college, Morse became a painter. In 1832, while sailing home from Europe, he heard about the newly discovered electromagnet and came up with an idea for an electric telegraph. He had no idea that other inventors were already at work on the concept.
Morse spent the next several years developing a prototype and took on two partners, Leonard Gale and Alfred Vail, to help him. In 1838, he demonstrated his invention using Morse code, in which dots and dashes represented letters and numbers. In 1843, Morse finally convinced a skeptical Congress to fund the construction of the first telegraph line in the United States, from Washington, D.C., to Baltimore. In May 1844, Morse sent the first official telegram over the line, with the message: “What hath God wrought!”
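The dot-and-dash scheme described above survives today as International Morse Code, a standardized descendant of Morse and Vail’s original American code. As a toy illustration (using the modern international character table, not Morse’s original 1838 code), encoding a message might look like this:

```python
# International Morse Code table for letters (a modern standard descended
# from Morse and Vail's original dot-dash scheme).
MORSE = {
    "A": ".-",   "B": "-...", "C": "-.-.", "D": "-..",  "E": ".",
    "F": "..-.", "G": "--.",  "H": "....", "I": "..",   "J": ".---",
    "K": "-.-",  "L": ".-..", "M": "--",   "N": "-.",   "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.",  "S": "...",  "T": "-",
    "U": "..-",  "V": "...-", "W": ".--",  "X": "-..-", "Y": "-.--",
    "Z": "--..",
}

def encode(text):
    """Encode letters as dots and dashes; letters are separated by spaces
    and words by ' / '. Characters outside the table are skipped."""
    words = text.upper().split()
    return " / ".join(
        " ".join(MORSE[ch] for ch in word if ch in MORSE)
        for word in words
    )

print(encode("What hath God wrought"))
```

Because every letter maps to a short unique dot-dash pattern, an operator keying these pulses could spell out any message over a single wire.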
Over the next few years, private companies, using Morse’s patent, set up telegraph lines around the Northeast. In 1851, the New York and Mississippi Valley Printing Telegraph Company was founded; it would later change its name to Western Union. In 1861, Western Union finished the first transcontinental line across the United States. Five years later, the first successful permanent line across the Atlantic Ocean was constructed and by the end of the century telegraph systems were in place in Africa, Asia and Australia.
Because telegraph companies typically charged by the word, telegrams became known for their succinct prose–whether they contained happy or sad news. The word “stop,” which was free, was used in place of a period, for which there was a charge. In 1933, Western Union introduced singing telegrams. During World War II, Americans came to dread the sight of Western Union couriers because the military used telegrams to inform families about soldiers’ deaths.
Over the course of the 20th century, telegraph messages were largely replaced by cheap long-distance phone service, faxes and email. Western Union delivered its final telegram in January 2006.
Samuel Morse died wealthy and famous in New York City on April 2, 1872, at age 80.
On January 5, 1933, construction begins on the Golden Gate Bridge, as workers begin excavating 3.25 million cubic feet of dirt for the structure’s huge anchorages.
Following the Gold Rush boom that began in 1849, speculators realized the land north of San Francisco Bay would increase in value in direct proportion to its accessibility to the city. Soon, a plan was hatched to build a bridge that would span the Golden Gate, a narrow, 400-foot-deep strait that serves as the mouth of San Francisco Bay, connecting the San Francisco Peninsula with the southern end of Marin County.
Although the idea went back as far as 1869, the proposal took root in 1916. A former engineering student, James Wilkins, working as a journalist with the San Francisco Bulletin, called for a suspension bridge with a center span of 3,000 feet, nearly twice the length of any in existence. Wilkins’ idea was estimated to cost an astounding $100 million. So, San Francisco’s city engineer, Michael M. O’Shaughnessy (he’s also credited with coming up with the name Golden Gate Bridge), began asking bridge engineers whether they could do it for less.
Engineer and poet Joseph Strauss, a 5-foot-tall Cincinnati-born Chicagoan, said he could.
Eventually, O’Shaughnessy and Strauss concluded they could build a pure suspension bridge within a practical range of $25-30 million with a main span of at least 4,000 feet. The construction plan still faced opposition, including litigation, from many sources. By the time most of the obstacles were cleared, the Great Depression had begun, limiting financing options, so officials convinced voters to support $35 million in bonded indebtedness, citing the jobs that would be created by the project. However, the bonds couldn’t be sold until 1932, when the San Francisco-based Bank of America agreed to buy the entire issue in order to help the local economy.
The Golden Gate Bridge officially opened on May 27, 1937, the longest bridge span in the world at the time. The first public crossing had taken place the day before, when 200,000 people walked, ran and even roller skated over the new bridge.
With its tall towers and famous red paint job, the bridge quickly became a famous American landmark, and a symbol of San Francisco.
On this day in 1999, for the first time since Charlemagne’s reign in the ninth century, Europe is united with a common currency when the “euro” debuts as a financial unit in corporate and investment markets. Eleven European Union (EU) nations (Austria, Belgium, Finland, France, Germany, Ireland, Italy, Luxembourg, the Netherlands, Portugal and Spain), representing some 290 million people, launched the currency in the hopes of increasing European integration and economic growth. Closing at a robust 1.17 U.S. dollars on its first day, the euro promised to give the dollar a run for its money in the new global economy. Euro cash, decorated with architectural images, symbols of European unity and member-state motifs, went into circulation on January 1, 2002, replacing the Austrian schilling, Belgian franc, Finnish markka, French franc, German mark, Italian lira, Irish punt, Luxembourg franc, Netherlands guilder, Portuguese escudo and Spanish peseta. A number of territories and non-EU nations including Monaco and Vatican City also adopted the euro.
Conversion to the euro wasn’t without controversy. Despite the practical benefits of a common currency that would make it easier to do business and travel throughout Europe, there were concerns that the changeover process would be costly and chaotic, encourage counterfeiting, lead to inflation and cause individual nations to lose control over their economic policies. Great Britain, Sweden and Denmark opted not to use the euro. Greece, after initially being excluded for failing to meet all the required conditions, adopted the euro in January 2001, becoming the 12th member of the so-called eurozone.
The euro was established by the 1992 Maastricht Treaty on European Union, which spelled out specific economic requirements, including a high degree of price stability and low inflation, that countries must meet before they can begin using the new money. The euro consists of 8 coins and 7 paper bills. The Frankfurt-based European Central Bank (ECB) manages the euro and sets interest rates and other monetary policies. In 2004, 10 more countries joined the EU–Cyprus, Czech Republic, Estonia, Hungary, Latvia, Lithuania, Malta, Poland, Slovakia and Slovenia. Several of these countries plan to start using the euro in 2007, with the rest to follow in coming years.
On this day in 1990, Panama’s General Manuel Antonio Noriega, after holing up for 10 days at the Vatican embassy in Panama City, surrenders to U.S. military troops to face charges of drug trafficking. Noriega was flown to Miami the following day and crowds of citizens on the streets of Panama City rejoiced. On July 10, 1992, the former dictator was convicted of drug trafficking, money laundering and racketeering and sentenced to 40 years in prison.
Noriega, who was born in Panama in 1938, was a loyal soldier to General Omar Torrijos, who seized power in a 1968 coup. Under Torrijos, Noriega headed up the notorious G-2 intelligence service, which harassed and terrorized people who criticized the Torrijos regime. Noriega also became a C.I.A. operative, while at the same time getting rich smuggling drugs.
In 1981, Omar Torrijos died in a plane crash and after a two-year power struggle, Noriega emerged as general of Panama’s military forces. He became the country’s de facto leader, fixing presidential elections so he could install his own puppet officials. Noriega’s rule was marked by corruption and violence. He also became a double agent, selling American intelligence secrets to Cuba and Eastern European governments. In 1987, when Panamanians organized protests against Noriega and demanded his ouster, he declared a national emergency, shut down radio stations and newspapers and forced his political enemies into exile.
That year the United States cut off aid to Panama and tried to get Noriega to resign; in 1988, the U.S. began considering the use of military action to put an end to his drug trafficking. Noriega voided the May 1989 presidential election, which included a U.S.-backed candidate, and in December of that year he declared his country to be in a state of war with the United States. Shortly afterward, an American marine was killed by Panamanian soldiers. President George H.W. Bush authorized “Operation Just Cause,” and on December 20, 1989, 13,000 U.S. troops were sent to occupy Panama City, along with the 12,000 already there, and seize Noriega. During the invasion, 23 U.S. troops were killed in action and over 300 were wounded. Approximately 450 Panamanian troops were killed; estimates for the number of civilians who died range from several hundred to several thousand, with thousands injured.
Today, Noriega, derogatorily nicknamed “Pineapple Face” in reference to his pockmarked skin, is serving his sentence at a federal prison in Miami.