This blog is devoted to remembrances and essays on general topics, including literature and writing. It has evolved over time, and some older posts on this site might reflect a different perspective and purpose.

New posts on Wednesdays. Email wallacemike8@gmail.com

Tuesday, November 29, 2011

Eyewitness to History


            Tom Wicker, who died last week at the age of 85, was one of Scotty’s boys, a group of journalists — also including Anthony Lewis, Max Frankel and Russell Baker — who were recruited for the New York Times Washington bureau in the late Fifties and early Sixties by its chief, James “Scotty” Reston. It was, along with the team Edward R. Murrow recruited for CBS radio in the late 1930s, one of the greatest assemblies of journalistic talent in the 20th Century.
            Wicker went on to succeed Reston as bureau chief in 1964 and was probably best known as a Times columnist. For a quarter of a century, his “In the Nation” was one of the paper’s op-ed anchors, skeptical of authority and arguing for civil rights and fairness. When Watergate was a breaking story, the two columnists I always looked forward to reading on the subject were Wicker and Mary McGrory of the Washington Star. There was a reason they were on Nixon’s enemies list.
            In addition to his column, Wicker wrote books, narrated documentaries and sometimes intervened in news situations. Most famously, that occurred at Attica, where he was one of a handful of civilians who tried to mediate a solution to the hostage situation at that state prison. It ended badly, with a bloody shootout, but in his book, which appeared a few years later, Wicker was more sympathetic to the inmates than to the authorities.
            His life, however, is inextricably linked to a single event. In the fall of 1963 the Times sent him on a presidential road trip that wasn’t expected to generate much in the way of news. Wicker was riding in a press bus behind John F. Kennedy’s motorcade in Dallas when the shooting started, and was the only Times reporter on the scene. The magnitude of the task facing Wicker that day was described a few years later by Gay Talese in his book The Kingdom and the Power:
            “During the next few hours the details began to pile up … there were truths, half-truths, errors, illusions, rumors, secondhand accounts, thirdhand accounts — all these were passed freely to the press, were circulated among them, and there was very little time to check these facts or allegations. Wicker and the rest (of the reporters at the scene) had to go largely on instinct, the totality of their experience so far in life, their insight into others, a special sense that good reporters develop and use in a time of crisis …
            “Wicker was writing for history … If there were major errors in Wicker’s story, which there were not, they, too, would survive, degrading Wicker among his colleagues but degrading the Times much more among its readers, not only the million or so who would see the story that day,  but also those who would read it a half-century from now, the students and historians who would be turning it up again and again on microfilm.”
            Microfilm? What a blast from the past. After hearing of Wicker’s death, I looked up the story online from the comfort of my home (www.nytimes.com/learning/general/onthisday/big/1122.html) and it was indeed impressive — stunning in its completeness and factual accuracy, especially considering how little was known in the chaos and immediate aftermath of the assassination. All the more so, given that of the 106 paragraphs, only one described something Wicker actually saw with his own eyes: Jackie Kennedy walking out the emergency entrance of Parkland Hospital after the president had been pronounced dead.
            Anybody can send a tweet or shoot a video on a cell phone these days, and those things have indisputably added to the overall level of raw information. But real journalism isn’t about capturing a scene or a moment; it’s about pulling together a broad range of information from a broad range of sources and making knowledgeable, nuanced sense of it, often in a hell of a hurry.  It is no job for an amateur, and it has humbled many a professional. Wicker’s passing and his singular achievement remind us of that at a time when we seem to be forgetting.

Friday, November 25, 2011

Basing Policy on Bad Movie Plots


            From two decades of working for a daily newspaper, I can tell you without checking how many stories we ran about someone in our community who used a gun to defend himself and his family in his own home. Exactly one.
            Over that same period we ran quite a few stories about people who shot spouses, family members, fellow card players, innocent bystanders and themselves. There were so many I couldn’t give you an exact number.
            So now that the U.S. Supreme Court has ruled that there is an individual right to own a gun, by all means go ahead and buy one if you want. It may make you feel better and more secure, and good for you if it does. But please try not to forget the statistical reality: Having a gun for protection is a comforting illusion, but an illusion nonetheless.
            There’s an obvious reason that guns are so rarely used in self-defense, and no one ever seems to mention it. Simply put, any villain that you need a gun to deal with isn’t likely to give you a chance to use it. Whether he’s breaking into your house or leaping from a dark alley to mug you, he’s going to try to get the drop on you.
            Of course you may be lucky and get enough advance warning to draw your weapon in time. Long odds, but it does happen. You still have to be a better shooter than the bad guy, and you have to be sure you’re really getting the bad guy. One of the stories I wrote for the paper had to do with a kid under the age of 10 who saw a burglar outside, grabbed a gun that was in the house and plugged the intruder — who turned out to be a member of the household, locked out without his key and looking for another way in on what proved to be a fatal night.
            In the national security realm, there’s a similar logical fallacy involving torture. How many times have you heard someone, arguing against the anti-torture crowd, ask: What if you’d captured a terrorist who knew where a nuclear device was about to be detonated in a major city? Wouldn’t torturing that terrorist to get the information be morally acceptable in the name of saving millions of innocents?
            Perhaps, with grave reservations, it would in theory. But how would such a situation plausibly arise in the real world? To know, with the certainty the situation would require, that the person in custody is a terrorist and is involved in a nuclear bomb plot, you would have to have been tracking that individual and his associates for some time. Any competent police or intelligence agency would round them all up the minute a nuclear plot was suspected, not let them all get away except for the poor wretch headed for the rack.
            And even supposing the other conspirators slipped through the dragnet, they would have to know that their missing comrade might be compromising the mission. He, for his part, might figure that all he has to do is send his interrogators to the wrong place once, giving his people a chance to detonate the bomb. The odds that he’d spill the details of the plot in time for authorities to stop the bombers and save millions are not good. In fact, they’re about as good as the odds that having a gun in your house will some day protect you against a criminal.

Tuesday, November 22, 2011

A Warm Feeling for Thanksgiving


            Twenty-five years ago, on the night before Thanksgiving, we closed the deal on the house in which we now live. We didn’t move in until March, so this will be our twenty-fifth winter here, and it should be pleasantly different from all the others.
            One reason, I suspect, that the house had been on the market for a year before we bought it was the woodstove plunked down in the middle of the living room. It seemed like an afterthought and made a sensible furniture arrangement impossible. When we met with the owners to sign the final papers just before the end of the year, I mentioned that we were planning to take out the woodstove and put a fireplace in the living room wall.
            “I wouldn’t do that if I were you,” said the owner. He was a plumbing contractor, one of the biggest in the county, who had built the home for himself and his wife, then had built and moved into a new one next door. “I put in a newfangled hot-water heating system that’s supposed to be the great new thing, but it hasn’t worked at all. We had to put the stove in because we were so cold the first winter.”
            We didn’t listen to him, but we should have — sort of.
            In the fall of 1987 the weather was pleasant until about Thanksgiving, when the first frost hit. We woke up one morning to find the roof covered with white and the temperature inside the house at 58 degrees. We set the thermostat to 70 and went to work. When we got home at 5 that afternoon, the temperature in the house, with the heat turned on all day, had risen to 62. My mother came to visit us for the holiday and spent most of it wrapped in a blanket, even when she was walking around the house.
            The fireplace worked great, though, as long as you were sitting within a few feet of it. If you were working in the kitchen or sleeping in the bedroom upstairs, not so good.
            So for 24 winters we coped with the cold in various ways — fires, space heaters, heavy sweaters. From time to time we talked about biting the bullet and putting in an entire new heating system, but it never came to anything. Our California winters are fairly mild, as a rule, and the cold was oppressive for only six to eight weeks a year. Six to eight long weeks.
            During a cold snap this spring, we turned the heater on one last time and heard a thud that sounded like an anvil dropped from the Empire State Building. The heater was dead altogether, putting out nothing, not even the inadequate warmth it had managed before.
            We waited through the summer, when we never use it anyway, then after doing some homework, called a plumber who specializes in that sort of system. He came over, took a look and scratched his head.
            “This thing been keeping you warm?” he asked. We said it hadn’t. “I’m not surprised,” he said. “It’s only heating the water to 120 degrees instead of 180 like it should. Plus the outlet valve is open where it shouldn’t be, so half the water is going straight to the septic tank instead of into your house. How come you never did anything about this before?”
            Biting her tongue, Linda said that if we’d bought the house from an English teacher, we would have, but since the leading plumber in the county, who had to live in the place himself, had said he couldn’t get the heater to work, we assumed the problem was unfixable.
            Five thousand dollars later, we have a new, energy-efficient heater that’s running at the right temperature and pumping the hot water into the heating system, not the septic tank. We can tell the difference every day, and this Thanksgiving we will be giving thanks, among other things, for heat.

Friday, November 18, 2011

The Team That Always Let You Down


            They say that the closest bond between fans and a team occurs not when the team wins a championship, but rather when it comes achingly close and falls just short. I’ve been there.
            Growing up in the Los Angeles area, I was a fan of the Dodgers and the Rams. There was a decade, the 1970s, when the Rams tormented their fans beyond endurance. Along with Pittsburgh, Dallas, Oakland and Miami, the Rams were one of the NFL’s elite teams that decade, with a defense arguably as good as the fabled Steel Curtain in Pittsburgh. The Rams were also the only team on the list not to win a Super Bowl that decade.
            How did they fail? Let me count the ways.
            • After the 1976 season they traveled to Minnesota to face the Vikings in the NFC Championship game. With the temperature near zero, the Rams took the opening kickoff and burned half the first quarter on a drive that took them to a fourth and goal from the Minnesota six-inch line. As they attempted a field goal, one of the linemen lost his footing on an icy patch of turf, allowing a defender to come through and block the kick, which was returned 99 yards for a touchdown. Final: 24-13, Minnesota.
            • The following year the Rams hosted a much weaker Viking team in Los Angeles, where weather shouldn’t be a factor. When I turned on the TV set to watch, the first sound I heard was Dick Enberg saying, “It’s pouring rain at the Coliseum in Los Angeles.” Three and a half inches fell during the game. Final score in that slopfest: 14-7, Minnesota.
            • In 1978 the Rams handily beat the Dallas Cowboys in the regular season and hosted the NFC championship game in Los Angeles. At halftime it was a 0-0 tie, but the Rams had lost their first and second-string quarterbacks and every running back on the roster to injury. In the second half, the third-string quarterback tried handing off to a defensive back who hadn’t been asked to be a ball carrier since high school. Final:  Dallas 28, Rams 0.
            • Finally, the 1979 team, with only a 9-7 record, made it to the Super Bowl. The Rams were 11-point underdogs to the Steelers, but led 19-17 after three quarters. Then Terry Bradshaw completed a long touchdown pass to John Stallworth, literally inches over the outstretched fingers of Ram defender Rod Perry, and the Rams choked in the fourth quarter. Final: 31-19, Pittsburgh.
            After that, the Rams were no longer an elite team. They were good during most of the 1980s but played in the same division as the San Francisco 49ers, who were winning four Super Bowls that decade. A wild-card berth and a quick exit were about all a fan could hope for.
            In the 1990s, they ceased to be even that good. When the team moved to St. Louis in the second half of the decade, I officially terminated my fanhood. A California man, born and bred, simply cannot root for a Midwestern team as his first choice.
            Then in 1999, out of nowhere, lightning struck. Backup quarterback Kurt Warner paired with offensive coordinator Mike Martz to produce The Greatest Show on Turf, a passing game that shredded defenses and took the Rams, the St. Louis Rams, to a 13-3 regular season record and a Super Bowl victory over the Tennessee Titans, 23-16.
            During the season Sports Illustrated ran a cover of Warner with the headline, “Who Is This Guy?” My question was where was he in the 1970s, when the Rams were a good quarterback away from a Super Bowl win every year? Life’s a funny old dog, never more so than for the sports fan or team.

Tuesday, November 15, 2011

'Go Ahead and Buy the House'


            The small family newspaper group I used to work for once sent two of its top non-news, business-side executives to a high-level management seminar where they were given the following scenario and asked to come up with an answer.
            Ad revenues have been slumping and cuts are going to have to be made, including, probably, terminating some employees. You are still in the process of figuring out what cuts will be made and in what time frame when an employee likely to be let go comes to you.
            “My spouse just got a promotion,” the employee says, “and we’ve been looking at houses and saw one we want to make an offer on. The only thing that concerns me is that I know business has been slow here, so I wanted to ask if there’s any reason we should wait.”
            That’s a great question, in a seminar or in real life, because it raises a whole range of judgments a manager has to make, involving a variety of ethical and pragmatic business concerns. Given the scope of those concerns and the diversity of employee personalities, there’s probably no one correct answer.
            There is, however, one indisputably wrong answer. That would be to look the employee squarely in the eye and say, “Go ahead and buy the house.”
            Which was exactly what the seminar leaders said you should do. The two people from our company objected strenuously, saying you can’t lie to your employees like that, but in a room of two dozen executives they were the only two who had a problem with it. Everybody else either thought that was a fine thing to do or was reluctant to dissent.
            I remember that story very well because when I first heard it, it was an aha! moment at a couple of levels. Up to that point I hadn’t thought there were such things as seminars that actually taught executives to be slimeballs; I had assumed that was simply an inherent character defect cultivated through practice.
            Even more disturbing, though, was the part of the story where almost no one objected. I tried to imagine the people taking that seminar. I’m sure that many of them were churchgoers, family men and women, highly respected pillars of the community. Yet somehow, in that corporate setting, the simple human decency evident in the rest of their lives went out the window.
            Although I may not have realized it at the time, it was an illustration of how any corporation or other large organization can corrupt the people who work for it. There is an almost irresistible tendency to get caught up in an organization’s imperatives and interests and to confuse those with what’s right. The organization cares only about its bottom line, whatever that is, and the human panoply of values is not always fairly matched against that single-minded focus. Over time you can see how that sort of pressure could lead Wall Street executives to be clueless about how their bonuses are perceived or could lead an official to treat a report of sexual assault like an annoying memo that had to be cleared out of the inbox and kicked over to someone else as soon as possible.
            In my brief management career, I was leaned on to do things I felt uncomfortable about and even did some of them. Maybe it was no big deal, but I’m glad I got out when I did. I’d hate to think that a few years down the road I might have gone to a seminar and remained silent when the facilitator said I should tell my employee to buy the house.                       

Friday, November 11, 2011

In the Presence of the Famous


             Sid Melton died last week in Burbank at the ripe old age of 94. If you’re much under 60, it’s unlikely the name would register, and even if you’re over 60, the name alone might not do it. See his picture, though, and you’d remember.
            Melton was one of those great character actors who had a long run in TV and movies. He’s probably best known for playing “Uncle Charlie Halper,” the owner of the nightclub where Danny Thomas performed in his long-running TV show. Or, perhaps, as Alf Monroe, the inept handyman on Green Acres. Or, if you really want to go back, as Ichabod Mudd, the sidekick of Captain Midnight on the Saturday morning TV show of that name in the early 1950s. To the end of his life, the Times reported, people would come up to Melton and mimic his signature line from that show, “Mudd, with two D’s.”
            When I got the news, however, I remembered something different and more personal. Sid Melton was in the room on my first date in the summer of 1967.
            I’d finally worked up the nerve to ask out a girl from Holy Family High School, who’d taken a summer school drama class with me at Hoover High, my school in Glendale. We went to see Neil Simon’s Barefoot in the Park at the Las Palmas Theatre in Hollywood, then had a pizza afterward at Miceli’s Restaurant next door. I just Googled, and both the theatre and Miceli’s are still around, against all odds, 44 years later.
            Melton had one of the major supporting roles in the play, which I remember as being very funny. And remembering seeing him in person got me to thinking about some of the other Hollywood figures I saw live.
            There was Marlene Dietrich, when she did a one-woman show at the Music Center in Los Angeles. There was John Carradine, at the end of his career and life, starring in a local community college production of The Man Who Came to Dinner, so ill and infirm he could barely say his lines. At a luncheon at UC Santa Cruz, in the early 1970s, I sat next to Jean Arthur, long-retired by then. A couple of years later I was in Berkeley for an evening with the legendary Howard Hawks, who directed John Wayne in Red River and Rio Bravo, Bogart and Bacall in The Big Sleep and To Have and Have Not, and Cary Grant in Bringing Up Baby and His Girl Friday.
            No one is alive now who saw Sarah Bernhardt act, and very few people living today have seen John Barrymore or Helen Hayes in live theater. It won’t be that long before nobody’s around who saw any of the people I’ve seen. Movies and TV will keep their names and images around for a long time to come, but when no one is left who knew them, however fleetingly, they’ll be ghosts.
            Nearly everyone I know has a memory of seeing a movie or TV star live, and they never forget it. There’s something about being in the presence of the real person that can’t be duplicated. And on rare occasions, a chain develops, linking mere mortals with a famous person from long ago. At a Rotary Club meeting, I once shook hands with a man who shook hands with a man who shook hands with President Lincoln, who was the media star of his age. It’s as close to Lincoln as I’ll ever get, and I cherish the connection.

Tuesday, November 8, 2011

Everyone Needs an Exit Interview


            Some time back a regular client sent me to interview a person associated with them who had gone on to achieve great distinction. Even though it meant driving more than a hundred miles, they asked me to do the interview in person. The subject was seriously ill and awaiting major surgery, and the client wanted to be sure I could hear what was being said and evaluate the level of fatigue as the interview went on.
            The interview took an hour and a quarter, and the subject never flagged. The voice was strong and clear; the answers lucid and detailed. As I walked out the door afterward I remember thinking, this was probably the best medicine the subject could have received: A chance to look back on past triumphs and important things done well.
            I’ve interviewed other people who were ill or dying and had similar experiences. In one instance the person was so slow and confused when the interview was scheduled that I wondered if we should even be doing it. Then, on interview day, he was so sharp and precise and loquacious that I could hardly get a question in. He died eight weeks later.
            For many years the New York Times had a reporter, Alden Whitman, whose job was to prepare obituaries of famous people, the ones who deserved the full treatment in America’s newspaper of record. They were written and kept on file so they could be pulled out when the moment arrived, whether the end came after a long and public illness (Francisco Franco) or without warning on a golf course (Bing Crosby).
            Whitman was almost never turned down when he requested an interview, and no wonder. Whatever one’s fears or expectations for an afterlife might be, knowing that the Times is going to give you the full obituary treatment has to take a bit of the sting out of death.
            In today’s journalism, where nearly everything is done on the fly and on a shoestring, that sort of advance planning rarely occurs. From time to time our local papers are caught flat-footed when a newsmaker from the past shuffles off the mortal coil unexpectedly (or, perhaps more accurately considering the age of those involved, without prior notice to the press).
            When that happens, it’s not unusual for me to get a call from a reporter seeking information or comment on someone whose public life I used to cover. No matter how busy I am, I always take time to help as much as I can, regardless of my feelings for the person. The wicked and the righteous alike deserve a good obituary, and obituaries of the wicked make for more interesting reading.
            What the obituary and the interview have in common is the conferring of importance on a life. When an oral historian comes to the nursing home to do an interview with someone closing in on the end, it’s a validation. It’s saying to the person interviewed, “What you did is still interesting or important, even after all these years. I want to hear about it, and through me, others will, too.”
            One of the great fears we carry with us through life, and one that can grow more acute with age, is that our lives were wasted or poorly spent. The person who’s interviewed late in life has to realize that’s not so. What a tonic. Medicare should pay to have everybody interviewed.

Friday, November 4, 2011

Not Everybody Wants to Be Rich


            In the ideology of economic individualism, which seems to have body-snatched the entire Republican Party, a core belief is that anybody can be rich, so it’s your fault if you’re not. I have a couple of problems with that.
            To begin with, the idea that anybody can get rich is one of those conceptual underpinnings of a democratic society that shouldn’t be mistaken for reality. Anybody can get hit by a meteor, too, but it isn’t going to happen to most of us. The possibility of riches is an example of what I call the Alaska syndrome. It’s great to know all that wilderness is out there, but I’m probably never going to visit it.
            One of my high school teachers used to say that any idiot could make a million dollars (this was back in the days when a million was real money) as long as the only thing that mattered to him was making the million. As soon as anything else became important, he said, the chances of getting rich dropped dramatically. This gets more to the heart of the matter.
            Leaving aside the question of how many people have the temperament to take the risks that lead to wealth, there’s the larger question of how many think the effort is worth it. There’s a cost to making money, and not everyone wants to pay it.
            If you decide to go for wealth, you have to put up your own money or round it up from other people; spend sleepless nights wondering if things will work out; realize that something utterly unforeseen could blow up your plans; and know that bankruptcy is around the corner if you make a mistake or just get a bad break at the wrong time. All this pretty much wipes out the rest of your life, making it hard to take a vacation, spend time with family and so on.
            If, by contrast, you could work a 40-hour-a-week job that paid a decent middle-class salary, allowed you to have dinner with the family most nights, watch football on the weekend, and take a vacation every summer, wouldn’t that be the more sensible way to go? A lot of us would agree, although finding that kind of job is getting tougher and tougher.
            That second path opens a lot of doors for society as well. If you get rich, you generally give money, because that’s what people ask you for. If you’re making a living, you give of yourself because that’s what you have to offer. The people who make a living are typically the ones who run the PTA, lead the church choir and coach youth sports. I’ve known a lot of people like that over the years, and they play a huge part in making a community vibrant.
            When your work is your life, it can become easy to assume that everyone else feels that way, too. For most people, though, work is an important part of life, but only a part. An advanced civilization cultivates policies that encourage the creation of jobs that let people make a decent living without consuming their entire lives.
            And that civilization needs to have support mechanisms to allow people the freedom to give of themselves to the community. A senior center can give an aging mother a safe place to go and free up her daughter to run the PTA or coach a soccer team. Medicaid covers nursing home costs so a son doesn’t have to take a second job and can continue coaching Little League or running the choir. When those enabling jobs and programs vanish, civil society frays at the ends and becomes, in some degree, less civil.

Tuesday, November 1, 2011

The Conspiratorial Mind at Work


            Donald Ogden Stewart, one of the resident wits of the Algonquin Round Table, has been credited with the observation that there are two kinds of people in the world: Those who think there are two kinds of people in the world and those who don’t. Include me in the first camp. I think the two kinds of people in the world are those who believe in conspiracies and those who don’t.
            Conspiracies happen, to be sure, but my experience and observation lead me to believe that they are not as common as most people think and are rarely successful, for the simple reason that Ben Franklin posited in Poor Richard’s Almanack some two hundred fifty years ago: Three can keep a secret only when two are dead.
            One of my favorite principles, in trying to sort out a tough question, is that of Occam’s Razor, which states, in essence, that the simplest of competing theories should generally be preferred to the more complex theories or explanations.
            Consider the application of that principle to an issue such as President Obama’s birth certificate. The simplest theory would be that since he produced a certificate, the genuineness of which was attested to by the governor and the secretary of state of Hawaii, which issued it, we should accept its validity and move on to more important matters.
            To believe otherwise, you would have to believe that those elected officials were lying; that someone thought, nearly fifty years ago, to plant bogus birth announcements in the Honolulu papers, realizing that this phony baby was going to be president; that every candidate, Republican and Democrat, who ran against Obama and checked his background with a microscope decided to be quiet about it; that the same could be said for every single news outlet, which otherwise was starving for a sensational scoop … the head begins to swim — no, flounder — in rough intellectual waters.
            Experts who study such matters say, and it makes sense, that conspiracy theories and paranoid reactions tend to be triggered by shocks and calamities. I’ve often thought, for example, that the assassination of President Kennedy was such a profound shock that many people could cope with it in no other way but to imagine a conspiracy at work. The notion that one nut with a gun could so easily upset the order in which we imagine we live (even though that’s the way the evidence points) is not acceptable. It makes the world too random and chaotic, hence some greater force must have been behind the disruption. Since it’s impossible to prove a negative, that suspicion will never go away because there’s always a chance, however remote, that it could turn out to be right.
            The crash of the economy in 2008 provided fodder for conspiracy buffs of all stripes. In the conspiratorial telling, it didn’t happen because normal people (many of whom you wouldn’t mind having as neighbors) did some things they may have felt uneasy about but figured couldn’t be that bad. It happened because (take your pick) greedy bankers and Wall Street financial houses or government officials deliberately set out to screw the country. There’s a quote that captures much of the current feeling.
            “How can we account for our present situation unless we believe that men high in this government are concerting to deliver us to disaster? This must be the product of a great conspiracy, a conspiracy on a scale so immense as to dwarf any previous such venture in the history of man.”
            Those words were spoken on the Senate floor by the junior Republican from Wisconsin, Sen. Joe McCarthy, on June 14, 1951.