M.A.D. city

In 1956, during the Suez crisis, Nikita Khrushchev threatened to attack London and Paris with missiles if Britain and France did not withdraw their forces from Egypt. And, in 1969, Richard Nixon ordered B-52s armed with hydrogen bombs to fly routes up and down the coast of the Soviet Union—part of his “madman theory,” a strategy intended to convince the North Vietnamese that he was capable of anything, and so to push them to negotiate for peace. (The madman strategy was no more effective than anything else the United States tried, short of withdrawal, in the hope of bringing an end to the Vietnam War.)

But most of the danger that human beings faced from nuclear weapons after the destruction of Hiroshima and Nagasaki had to do with inadvertence—with bombs dropped by mistake, bombers catching on fire or crashing, missiles exploding, and computers miscalculating and people jumping to the wrong conclusion. On most days, the probability of a nuclear explosion happening by accident was far greater than the probability that someone would deliberately start a war.

In the early years of the Cold War, many of these accidents involved airplanes. In 1958, for example, a B-47 bomber carrying a Mark 36 hydrogen bomb, one of the most powerful weapons in the American arsenal, caught fire while taxiing on a runway at an airbase in Morocco. The plane split in two, the base was evacuated, and the fire burned for two and a half hours. But the explosives in the warhead didn’t detonate; if they had, they could have set off a nuclear chain reaction. Although the King of Morocco was informed, the accident was otherwise kept a secret.

Six weeks later, a Mark 6 landed in the back yard of a house in Mars Bluff, South Carolina. It had fallen when a crewman mistakenly grabbed the manual bomb-release lever. The nuclear core had not been inserted, but the explosives detonated, killing a lot of chickens, sending members of the family to the hospital, and leaving a thirty-five-foot crater. Although it was impossible to keep that event a secret, the Strategic Air Command (SAC), which controlled the airborne nuclear arsenal, informed the public that the incident was the first of its kind. In fact, the previous year, a hydrogen bomb, also without a core, had been accidentally released near Albuquerque and exploded on impact.

Soon after the successful Soviet launch of Sputnik, in 1957, missiles became the preferred delivery vehicle for nuclear warheads, but scary things kept happening. In 1960, the computer at the North American Air Defense Command (NORAD) in Colorado Springs warned, with 99.9-per-cent certainty, that the Soviets had just launched a full-scale missile attack against North America. The warheads would land within minutes. When it was learned that Khrushchev was in New York City, at the United Nations, and when no missiles landed, officials concluded that the warning was a false alarm. They later discovered that the Ballistic Missile Early Warning System at Thule Airbase, in Greenland, had interpreted the moon rising over Norway as a missile attack from Siberia.

In 1979, NORAD’s computer again warned of an all-out Soviet attack. Bombers were manned, missiles were placed on alert, and air-traffic controllers notified commercial aircraft that they might soon be ordered to land. An investigation revealed that a technician had mistakenly put a war-games tape, intended as part of a training exercise, into the computer. A year later, it happened a third time: Zbigniew Brzezinski, the national-security adviser, was called at home at two-thirty in the morning and informed that two hundred and twenty missiles were on their way toward the United States. That false alarm was the fault of a defective computer chip that cost forty-six cents.

A study run by Sandia National Laboratories, which oversees the production and security of American nuclear-weapons systems, discovered that between 1950 and 1968 at least twelve hundred nuclear weapons had been involved in “significant” accidents. Even bombs that worked didn’t work quite as planned. In Little Boy, the bomb dropped on Hiroshima on August 6, 1945, only 1.38 per cent of the nuclear core, less than a kilogram of uranium, fissioned (although the bomb killed eighty thousand people). The bomb dropped on Nagasaki, three days later, was a mile off target (and killed forty thousand people). A test of the hydrogen bomb in the Bikini atoll, in 1954, produced a yield of fifteen megatons, three times as great as scientists had predicted, and spread lethal radioactive fallout over hundreds of square miles in the Pacific, some of it affecting American observers miles away from the blast site.

These stories, and many more, can be found in Eric Schlosser’s “Command and Control” (Penguin), an excellent journalistic investigation of the efforts made since the first atomic bomb was exploded, outside Alamogordo, New Mexico, on July 16, 1945, to put some kind of harness on nuclear weaponry. By a miracle of information management, Schlosser has synthesized a huge archive of material, including government reports, scientific papers, and a substantial historical and polemical literature on nukes, and transformed it into a crisp narrative covering more than fifty years of scientific and political change. And he has interwoven that narrative with a hair-raising, minute-by-minute account of an accident at a Titan II missile silo in Arkansas, in 1980, which he renders in the manner of a techno-thriller:

Plumb watched the nine-pound socket slip through the narrow gap between the platform and the missile, fall about seventy feet, hit the thrust mount, and then ricochet off the Titan II. It seemed to happen in slow motion. A moment later, fuel sprayed from a hole in the missile like water from a garden hose.

“Oh man,” Plumb thought. “This is not good.”


* * *

…The Arkansas incident, in 1980, is well chosen as an illustration of Schlosser’s point. Objects fall inside silos all the time, he says. The chance that a falling socket would puncture the skin of a Titan II missile was extremely remote—but not impossible. When it happened, it triggered a set of mechanical and human responses that quickly led to a nightmare of confusion and misdirection. Once enough fuel leaked out and the air pressure inside the tank dropped, the missile would collapse, the oxidizer would come into contact with the rocket fuel, and the missile would explode. Because a nineteen-year-old airman performing regular maintenance accidentally let a socket slip out of his wrench, a Titan II missile became a time bomb, and there was no way to turn off the timer.

And the missile was armed. Schlosser says that the explosive force of the warhead on a Titan II is nine megatons, which is three times the force of all the bombs dropped in the Second World War, including the atomic bombs that destroyed Hiroshima and Nagasaki. If it had detonated, most of the state of Arkansas would have been wiped out.

Few systems are more tightly coupled than the arsenal controlled by the nuclear football. Once the launch codes are entered, a chain of events is set in motion that is almost impossible to interrupt. The “Dr. Strangelove” scenario is quite realistic. The American nuclear-war plan, known as the Single Integrated Operational Plan (SIOP), provided for only one kind of response to an attack: full-scale nuclear war. It was assumed that tens of millions of people would die. There were no post-attack plans. For forty years, this was the American nuclear option. The Soviets’ plan, no doubt, was much the same.

Henry Kissinger called the SIOP a “horror strategy.” Even Nixon was appalled by it. Schlosser says that when General George Butler became the head of the Strategic Air Command, in 1991, and read the SIOP he was stunned. “This was the single most absurd and irresponsible document I had ever reviewed in my life,” he told Schlosser. “I came to fully appreciate the truth. We escaped the Cold War without a nuclear holocaust by some combination of skill, luck, and divine intervention, and I suspect the latter in greatest proportion.”

Hollywood in the Fifties


Take, for instance, Alfred Hitchcock’s run of films from “Strangers on a Train” and “Rear Window” through “The Wrong Man,” “Vertigo,” and “Psycho”; films by Nicholas Ray, including “In a Lonely Place,” “Johnny Guitar,” “Rebel Without a Cause,” and “Bigger Than Life”; Anthony Mann’s run of Westerns with James Stewart, plus his “Man of the West” and the graceful, poignant “The Glenn Miller Story”; Douglas Sirk’s melodramas, including “All That Heaven Allows,” “Magnificent Obsession,” and “There’s Always Tomorrow”; Fritz Lang’s later works, such as “Human Desire,” “The Big Heat,” and “While the City Sleeps.” Otto Preminger created such sharp, ambivalent treasures as “Angel Face,” “The Man with the Golden Arm,” “Bonjour Tristesse,” and “Anatomy of a Murder”; Allan Dwan, who started in 1911, was still in business, making such jarring films as “Slightly Scarlet” and “Tennessee’s Partner.” Ida Lupino made incisive melodramas, including “The Bigamist”; Jacques Tourneur made “Stars in My Crown”; Robert Aldrich made the ultimate film noir, “Kiss Me Deadly”; Joseph L. Mankiewicz made such sharp and discerning films as “All About Eve,” the medical comedy “People Will Talk,” and the inside-Hollywood melodrama “The Barefoot Contessa,” and got Brando and Jean Simmons to sing, splendidly, in “Guys and Dolls.”

There were Westerns, films noirs, and bullfighting dramas by Budd Boetticher; there were exhilaratingly violent yet tender-hearted films by Samuel Fuller, such as “Pickup on South Street,” “Park Row,” and “Forty Guns”; there were the great comedies of Frank Tashlin (“Susan Slept Here,” “Will Success Spoil Rock Hunter?”), the many musicals, melodramas, and comedies of Vincente Minnelli (“An American in Paris,” “The Bad and the Beautiful,” “The Cobweb,” “Designing Woman,” “Some Came Running”), the musicals of Stanley Donen (“Singin’ in the Rain,” “The Pajama Game”). John Ford had a run of masterworks that included “The Sun Shines Bright,” “The Quiet Man,” and “The Searchers,” and, in 1962, he made the greatest American political film, “The Man Who Shot Liberty Valance”; in 1960, Jerry Lewis got his start as a director with the wildly inventive “The Bellboy”; Orson Welles made “Mr. Arkadin” and “Touch of Evil”; and Howard Hawks bracketed the decade with, at one end, “Monkey Business” and “The Big Sky” and, at the other, “Rio Bravo.”

[Louis Menand on Dwight Macdonald:]

Before 1962, an educated cultural consumer might understandably have concluded that there was not much in the world of popular entertainment that demanded serious attention. Hollywood was in the doldrums…

This is simply not so. The nineteen-fifties were, rather, something of a golden age of American cinema. It was a time when Hollywood directors, liberated aesthetically by the example of Orson Welles and practically by the court-mandated rise of independent producers, let loose with a profusion of widely varied works of remarkable emotional and visual audacity and originality.

The battles for late night

And yet men have always gone to war over late night. The “Tonight Show” is like the Tudor dynasty—from the beginning, nothing but succession troubles. The man who made it all matter—for Leno, Letterman, and O’Brien—was Johnny Carson. Carson had himself replaced a television legend, Jack Paar. In its earliest incarnations, “Tonight” had been routinely clobbered in its time slot by old movies, but in 1957 Paar took custody of the show and turned it into a reliable source of revenue for NBC. Carson’s bona fides were somewhat sketchy. His own variety program, “The Johnny Carson Show,” had been cancelled, after a single season, in 1956. When NBC offered him “Tonight,” in 1962, he was hosting a daytime quiz show on ABC called “Who Do You Trust?,” a knockoff of Groucho Marx’s long-running “You Bet Your Life.”

NBC had considered Groucho—also Bob Newhart, Jackie Gleason, and Joey Bishop—as the host for “Tonight” before approaching Carson. But Carson was a hit from the start. By the end of his first year, he was drawing an average of seven and a half million viewers, twice the size of Paar’s audience. And the pie just kept on growing. By 1965, the “Tonight Show” was reported to be out-earning NBC’s entire prime-time schedule.

This got the attention of the other networks. The “Tonight Show” was a programming novelty. Most of the early television executives came from radio, where late night had never inspired much advertiser interest. In the nineteen-fifties, some television stations simply played “The Star-Spangled Banner” after the eleven-o’clock news and went off the air. The man of vision in this area was Sylvester (Pat) Weaver, the vice-president and later the president of NBC. Seeing opportunity in the fringes, he created both “Today,” in 1952, with Dave Garroway as the host, and “Tonight,” in 1954, with Steve Allen. (He was also one half of the team that brought us Sigourney Weaver.)

Weaver was a Dartmouth philosophy major; his boss, the redoubtable David Sarnoff, president of RCA, which owned NBC, had never gone to college. Weaver believed that television could be “an enlightenment machine.” He fought against what he called “the robotry of habit viewing,” and opposed giving the schedule over to soap operas and situation comedies, generic staples of radio. He invented the special: his theory was that a large number of people will tune in to a program if there is buzz about it. The theory seemed to be paying off—NBC’s broadcast of the Broadway “Peter Pan,” with Mary Martin, attracted sixty-five million viewers, almost forty per cent of the entire population—when, in 1955, Sarnoff removed Weaver from the presidency to make room for his son. For the next twenty years, CBS, which had no compunctions about stuffing its schedule full of soap operas and situation comedies, ate NBC’s lunch.

The theory at CBS was the reverse of Weaver’s. It was that people don’t watch programs; they watch television. The job is only to have your show be the one they end up watching once they turn the set on. They don’t have to feel good about themselves for watching your show; they don’t even have to like it. What Weaver deprecated as “habit viewing” was just what CBS was looking to exploit. At NBC, the policy was referred to, contemptuously, as LOP—Least Objectionable Program—but it worked. CBS broadcast, in prime time, “I Love Lucy,” “Mister Ed,” “My Favorite Martian,” “Gilligan’s Island,” “The Beverly Hillbillies,” and “The Munsters,” all of them about as dumb as they come, and all of them huge hits. From 1962 to 1964, fifty-seven million Americans tuned in to watch “The Beverly Hillbillies” every week.

Neither CBS nor ABC gave much thought to late night until Carson showed the entertainment world that there was gold in those distant hills. They had been earning little or nothing after eleven o’clock, generally giving the time to their affiliates, who sold the commercials and ran old movies. In 1964, though, CBS and ABC started looking for talent to go up against Carson and NBC. Between 1964 and 1972, a number of men were pushed into the arena. One of them was Dick Cavett.