This blog is devoted to remembrances and essays on general topics, including literature and writing. It has evolved over time, and some older posts on this site might reflect a different perspective and purpose.

New posts on Wednesdays. Email wallacemike8@gmail.com

Friday, December 30, 2011

Criticism: A Game of Shadows


            We went to see Sherlock Holmes: A Game of Shadows on Christmas Eve and had a great time. Even Linda, who came along reluctantly, said she had a blast, and is now looking forward to renting the first Holmes film on Netflix.
            When I first heard that a Sherlock Holmes movie was coming out two years ago, with Robert Downey, Jr., as Holmes, my immediate, spontaneous reaction was, “What inspired casting!” Downey delivered the goods, but more surprising to me was the performance of Jude Law as Watson. He did a great job of capturing the essence of the character as Conan Doyle wrote him, and the interplay between Law and Downey was lots of fun to watch.
            That first film was hardly perfect (nor was Game of Shadows, for that matter) and it certainly took some liberties in turning Holmes into more of an action character. But some of that was to be expected, and compared to turning Holmes into a Nazi-hunter, as Hollywood did in the 1940s, with Basil Rathbone as Holmes and Nigel Bruce as Watson, it was hardly an indecent liberty.
Both the recent films were good clean fun that left me feeling as if I’d gotten my money’s worth, and the ending of Game of Shadows, setting up the inevitable sequel in clever and witty fashion, was particularly satisfying. The public seems to agree, having made both movies fairly solid hits.
Neither film, however, seems to have received much respect from the critics, who have been largely dismissive of both. I’m not one of those people who belittles the critics out of hand; they often play a valuable role in drawing attention to films by lesser-known directors who don’t have the backing of the full-throttle Hollywood publicity and advertising machine. But the reaction to the Holmes franchise (and that’s what it’s becoming) strikes me as an example of what I would refer to as the O’Brien paradigm.
It takes its name from my dear friend and mentor, the late Bud O’Brien. In the second half of his life, Bud became a great opera fan, eventually holding season tickets for the San Francisco Opera and going to New York every few years to catch a couple of shows at the Met.
Being a newspaperman and avid reader, Bud always read the reviews and came to find himself puzzled by the fact that the critics rarely lavished more than lukewarm praise on a production, however excellent, in the standard repertoire, say Barber of Seville, Rigoletto, Marriage of Figaro. But when a contemporary opera came out, the same critics invariably hauled out the superlatives.
One day the penny dropped. Bud was reading a column by a long-time opera critic, in which the critic happened to mention that he had seen La Traviata more than 60 times. Suddenly it all made sense.
“The season ticket holders at the San Francisco Opera look at the schedule,” Bud told me, “and their reaction is, ‘How wonderful! We get to see La Traviata this year. It’s been a long time.’ This critic, on the other hand, is looking at the schedule and saying, ‘Oh, God! Not #&%$!!@&# La Traviata again!’ His reaction is in another world than that of the audience he’s writing for.”
Movie critics, I suspect, have developed the same lack of regard for conventional fare, even when it’s done very well, and a greater regard for something, anything, that’s new and edgy. When the third Sherlock Holmes film comes out in a couple of years, they probably won’t like it, and I probably won’t let their opinions keep me from the theater.

Tuesday, December 27, 2011

Forever Unsettled


            At a Rotary Club meeting several years ago, we found ourselves listening to a self-proclaimed expert on the dangers of fluoride in the water supply, who painted a vivid and horrific picture of a demon chemical that clearly qualified as God’s biggest mistake. It was compelling stuff, as long as you didn’t stop to think that plenty of large American cities have had a fluoridated water supply for years, with no apparent major problems.
            While the speaker droned on, one of the club liberals (in Rotary a Nixon Republican qualifies as a liberal) leaned over and whispered to me, “I thought this issue was settled 40 years ago.”
            That was what I thought, too, at the time. But lately, so many things that seemed to have been settled issues keep resurfacing, like the serial killer in a series of slasher films, that I am beginning to wonder if there is ever an end to anything.
            It has been the better part of a century since the Protocols of the Elders of Zion, a document purporting to prove a Jewish conspiracy against the world, was conclusively proven to be a hoax. And yet, it still keeps coming up as part of some public figure’s belief system. You could argue, I suppose, that the belief systems are largely those of Middle Eastern strongmen, rather than American industrialists, but that’s a cold comfort.
            Even the time frame on these things seems to be shrinking. It used to be that when one of these old chestnuts resurfaced, it was debunked yet again, then faded into obscurity for at least another decade. Now, it seems, when an issue like President Obama’s birth certificate is put to rest, it’s back before us again in a matter of months.
            In my newspaper days there was one that popped up every few years: that the federal government was going to ban religious radio programming. Nothing at all to it, but every time it came back to the surface, we would get a rash of phone calls from anxious and angry people who demanded to know why we had never run a story about it. I don’t recall anyone ever believing me when I told them there was nothing to it.
            The resurfacing of long-settled issues isn’t limited to hoaxes. For the better part of a century, there has been a bipartisan understanding that the tax system should be more or less progressive — requiring those who benefit most from society to pay more for its upkeep — and that corporations should be regulated in the public interest. Debate used to be over the details, but the current Republican Party catechism is that taxes and regulations are inherently bad, which precludes all debate and compromise.
            Some of the bad ideas resurface connected to a revival in popularity of people one would have hoped had been discredited forever. W. Cleon Skousen — a Red-hunter of the fifties and sixties who was so far out there that William F. Buckley, Jr., considered him the type of extremist who discredited conservatism — has re-emerged as one of the intellectual luminaries behind  Glenn Beck’s world view. That view posits an American decline (which ignores rising prosperity, longevity and international influence) dating back to the implementation of Progressive politics beginning in the early 20th Century.
            It makes you wonder what’s next. Will we soon be seeing serious arguments for the restoration of the 60-hour workweek with no minimum wage? How about a public relations campaign in favor of the Jim Crow laws and slavery? The divine right of kings? I once thought all those things had been relegated to the dustbin of history, but I’m not so sure of anything any more.

Friday, December 23, 2011

Calling Larry Flynt


            A notice came in the e-mail a while back from Network Solutions, which registers my business domain name, asking me to check my contact information and correct it if it was out of date. I finally got around to it and realized they still had an old address and phone number, so I went online to change those two things.
            It took more than an hour.
            Part of that, to be sure, was on me. I hired someone to set up my website (including registering the domain) nearly 15 years ago, and had no idea what my user name and password were or where I could find them. I tried several of the usual suspects with no luck, and finally had to call their help line. After verifying that I was me, they set me up with a link to change my user name (which turned out to be randomly generated numbers) and password.
            I changed the password, and was asked to create three new security questions and answers, which I did.  Then I tried to log on to change the contact information and was again asked to create three new security questions and answers from the same menu. With an appointment on the horizon and a growing sense that this was not going to be easy, I logged out and decided to come back the next day.
            When I did, I created the same three security questions as before, then went to the manage accounts section. I clicked on change contact information, and a window came up asking me what I wanted to change my name to. I didn’t want to change my name, but there was no option for leaving it alone, so I clicked continue, which opened another irrelevant window; clicked continue yet again, which opened another irrelevant window; clicked continue again, and wound up back at the first window, asking me what I wanted to change my name to.
            Time for the phone again, so I called customer service and explained my predicament to the representative. He tried to walk me through the process, but I kept ending up back where I started. Finally, and apparently sensing he had a customer about to go postal, he said he’d change the address and phone number for me. I still don’t know how he did it, but thank God, it’s done.
            Wonderful as the internet can be, this sort of thing happens way too often, and doing something online is always a crapshoot. Sometimes it’s a piece of cake, and sometimes it takes an hour and a half to register a gift card. I have been reduced to a quivering puddle of frustration by the web sites of big-name companies with reputations for high levels of competence — Starbucks and the New Yorker, for example. In addition to having to know far too many user names and passwords, there are problems with unclear terminology, nonsensical directions, the thing you’re looking for hidden on a too-busy page, and browser differentials that can mean that the box you’re told to click below might actually be above and to the left.
            As the publisher of Hustler, Larry Flynt used to dumb-guy-proof his articles (Imagine! He actually thought someone might read them.) by having someone look them over with no other object than making sure that any idiot with a basic vocabulary would know what they were talking about. Until companies do a better job of dumb-guy-proofing their web sites, the online crapshoot will continue and cuss words will never go out of style.

Tuesday, December 20, 2011

Buying Books in Different Ways


            Sunday night I was reading the New York Times book section on my iPad and came across a favorable review by Liesl Schillinger of the new P.D. James volume, Death Comes to Pemberley. At the age of 91, James, one of Britain’s most honored mystery writers, has paid homage to Jane Austen by writing a mystery featuring the characters in Pride and Prejudice, set six years after the ending of that book.
            Knowing that my wife, Linda, is an avid Jane-ite, I e-mailed her the review with a query as to whether we should buy it. She answered in the affirmative and checked its availability on Amazon; the Kindle edition was $12.99.
            Going to Kindle, I called up the book, bought it using the one-click purchase option, and it was downloaded onto the iPad in three minutes. While I was on the Kindle site, I also bought and downloaded Richard Brookhiser’s James Madison. That night Linda read the first two thirds of the book (Pemberley, not Madison).
            This was way too easy, and a refutation of Carrie Fisher’s remark that the problem with instant gratification is that it’s not quick enough.
            In the same week that I bought the two e-books, Linda and I together bought six dead-tree books at two different bookstores, so for the time being, the old ways are winning. Looking back on the six books I personally bought (four from bookstores, two from Kindle) got me to thinking about what the market roles of “real” books and e-books might look like.
            Kindle and its internet competitors are great when you know what book you want and want it now. If you know the name of the book or its author, it’s a piece of cake to get it. In my experience, once you’ve decided to check out a book online, the idea of self-control becomes an illusion. If you go to the Kindle store looking for a book, you’re going to buy it upwards of 90 percent of the time.
            That, of course, assumes that the book has been digitized. Quite a few older ones haven’t. I recall going to Kindle to look for books by Jack Mann, who wrote some wonderfully atmospheric supernatural thrillers in the 1930s. None were on Kindle and only one was on Amazon as a hardcover book. I’ve had similar experiences looking for the work of authors from 20 or more years back. I have, however, found three Manns at the local used bookstore.
            Three of the dead-tree books I bought were paperbacks by authors I’d read previously and enjoyed. I saw the books while browsing at the two different bookstores and gave them the custom; Kindle wouldn’t have been any cheaper. The fourth was a paperback by an author I’d never heard of, but I found it in the new arrivals section and it looked interesting.
            Bookstores are great for that sort of serendipity, and I find it’s much easier to browse in one than online. Just walking an aisle or checking out a table, you’re more likely to encounter something you hadn’t known existed, and in which you might be interested. I still find online browsing a chore, particularly since I don’t want to rely too heavily on Amazon’s recommendations.
            My sixth book was the Madison biography, which I had seen at one of the two bookstores. It would have come in handy for a Founding Fathers wool-gathering project of mine, but I didn’t want to pay $30 hardcover. At $11.99 for the e-book version, Amazon had a sale.
            So that’s how I’ve been buying books lately, but as the brokerage houses like to remind us, past performance is no guarantee of future results.

Friday, December 16, 2011

Decline of the True Believers


                  “On the whole, the show has been good, as such things go in the Republic … it has at least brought together a large gang of picturesque characters, and it has given everyone a clear view of (the party’s) candidates and its platform. The former certainly do not emerge from it with anything properly describable as an access of dignity.”
                  No more spot-on description of this year’s Republican presidential debates has been written, and what makes it all the more impressive is that it wasn’t written about them at all. The quote above was taken from H.L. Mencken’s reporting in the Baltimore Evening Sun of the 1948 Progressive Party convention in Philadelphia, which nominated Henry Wallace for president.
                  All of which illustrates the way American politics are constantly shifting. The parties themselves undergo complete reversals of policy at periodic intervals. Consider that in the half century from the beginning of Lincoln’s term to the end of Teddy Roosevelt’s, the Republicans were the party of progress and the Democrats were more conservative. Lincoln was a supporter of labor unions and a believer in a strong federal government that used its muscle to promote economic growth and equality. It was during his presidency that the Homestead Act, arguably the most socialistic piece of legislation in America’s history, was passed, giving 160 acres of land to any American willing to settle it and farm it for five years.
                  Beginning in 1896, with the candidacy of William Jennings Bryan, and culminating with the election of Woodrow Wilson in 1912, the Democrats became the more progressive party, and the Republicans more conservative at the same time. The election of FDR in 1932 drew the line clearly, and on the surface, it would appear to continue so to this day.
                  I say on the surface, because upon further reflection, it becomes obvious that the Republican Party of today has all but abandoned conservative principles. My Webster’s defines conservatism as “a political philosophy based on tradition and social stability, stressing established institutions and preferring gradual development to abrupt change.” That actually sounds quite a bit like today’s Democrats, whose biggest campaign promise is to protect Social Security and Medicare, which, until about a year ago, were considered established institutions.
                  Perhaps the biggest change in American party politics in the past two decades is that the True Believers today are more apt to be found in the Republican Party and on the political right, than among the Democrats and on the left. The modern-day equivalent of Henry Wallace’s Progressive Party — one that adheres to clear and consistent policies with no detours for circumstance, exigency or reality — is the Libertarians. The contemporary equivalent of a 1930s leftist panacea, such as California’s Ham and Eggs pension scheme, would be Herman Cain’s 9-9-9 tax plan.
                  The growth of the True Believer mind-set has turned the contemporary Republican Party into a den of orthodoxy. At one debate, the candidates were asked if they would reject a plan that would balance the federal budget 90 percent by making cuts and 10 percent by raising taxes. Ronald Reagan would have jumped at that one. The eight Republican candidates all raised their hands in opposition. Even the Kremlin of old would have had the sense to put a dissenter or two into the mix, if only for appearance’s sake.
                  What it’s come down to is this: It used to be that the True Believers believed in a workers’ paradise, but somewhere along the way the socialist flame burned out. Today’s True Believers seem to believe in a taxpayers’ paradise and every man for himself. That’s supposed to be progress?

Tuesday, December 13, 2011

Showing Up in Person for Society


            Last Friday I drove to the county courthouse, as I do every year at this time, to pay my property tax in person. It’s one of those civic actions — sort of like voting, only more expensive — that make me feel like a member of the community.
            Paying income tax never feels like that. Estimated taxes still get sent by mail to Sacramento or to an IRS regional processing center and the tax return itself is now filed electronically by my accountant. There’s not much connection, if any at all, between the money going out and what it’s buying. That’s a big part of government’s PR problem.
            In the lobby of the courthouse, there’s a directory of all the services, which reminds you of where the tax money is going — law enforcement, parks, roads, planning and so on. You may not approve of it all, but at least you can see it’s going somewhere.
            Most years I go in on the day the tax is due, and there’s a line going out the door of the tax collector’s office. As often as not, there’s someone in the line that I know, and we strike up a brief conversation. I suspect that I’m not the only one who feels a bit more connected for paying in person, rather than by mail or online. This year I went in the day before the deadline, and there were only a couple of people in line and it was over in a minute or two.
            I really hope that was because I was early, and not because people don’t want to be bothered by doing things in person. We are losing far too many civic rituals to the Internet and other forms of electronic automation, even to snail mail, as with voting by absentee ballot.
            For many years, it used to be a big thing to go down to the courthouse on election night. I live in a small (by California standards) and politically passionate county, where city council and county races are hotly contested, as are local ballot measures. Election night was one time that people of all political stripes were together in the same basement room, and astonishingly cordial.
            It was a true community gathering in the best sense, and there were memorable moments. In 1994 I worked as a consultant on the campaign of the late Kathleen Akao, a local attorney who challenged an unpopular gubernatorial appointee for superior court judge. In the county’s 144-year history, no sitting judge had ever lost an election, but Akao campaigned hard, drew support from every segment of the community, and pulled off a stunning upset, winning with 54 percent of the vote.
            By ten o’clock that night, it was clear that she would win, and about a half hour later, she walked into the room, which erupted in spontaneous applause. Democracy in action, and all the more impressive because of the cross-section of people present.
            Four years later, in 1998, the county got new computers and began to put election results online. All at once there was nobody in the basement of the courthouse on election night. Instead, people went to the victory party of their favorite candidate and huddled around the computer to find out who was winning. There were no chance encounters with people on the other side.
            From a technological point of view it was unquestionably an improvement over having people running downstairs from the elections office and putting the latest results on the board. From a human and community standpoint, something was irretrievably lost in the switch. I, for one, miss it.


Friday, December 9, 2011

Echoes of 1948 and 1972


            It’s far too early to be making any predictions about next year’s presidential elections. We don’t know who the Republican nominee will be, we don’t know if there will be viable candidates from a third or fourth party, and we don’t know what’s going to happen in the world in the next ten months or so.
            With that qualification, I can’t help looking at what we can see now and being reminded (given a propensity for glib historical analogies) of two other elections gone by, where a lot of similar factors were in play.
            Barack Obama took office in the most tumultuous set of circumstances inherited by any president since Harry Truman, and there are some intriguing comparisons with Truman in his first term. Like Truman, Obama had to make a lot of tough decisions right away and pleased almost no one. Like Truman, he saw the Republicans decisively take over Congress in the mid-term elections. Like Truman, he has been scorned by many in his party, but unlike Truman he doesn’t appear to be in any danger of facing extra-party campaigns begun by disgruntled Democrats.
            For all those problems, plus some economic difficulties (inflation and postwar scarcity of some items), Truman ran a strong campaign, positioned himself as the friend of the average American, attacked the “do-nothing Congress,” and stunned everybody by defeating the excessively stiff New York Governor Thomas Dewey, who was once described by Dorothy Parker as looking “like the little man on top of the wedding cake.”
            That brings us to another troubled incumbent and another presidential election. Richard Nixon was elected president with 43 percent of the vote in an election where third-party candidate George Wallace polled more than 12 percent. Nixon was the most personality-deficient president of the television era, and he came before the voters for re-election in 1972 at a time when the economy was going through a period of inflationary shakiness.
            Nixon and Obama had some points in common. Like Obama, Nixon came to office inheriting a huge problem, the Vietnam War, and in four years he had neither won it nor gotten the country out of it. Like Obama, Nixon was a pragmatic moderate, whose moderation was appreciated by neither his party nor the opposition. And like Obama, Nixon drove many people in the opposition party absolutely out of their senses — bug-eyed, drooling crazy. It was the latter quality that got him re-elected.
             One of the best things in favor of a candidate is a vulnerable opponent. Many Democrats had become so rabid about Vietnam by 1972 that their primary criterion for a candidate was that he be as passionate against the war as they were. They ended up getting Sen. George McGovern of South Dakota, a smart and decent man who didn’t deserve all his followers.
            Despite his inability to connect personally with most voters, and despite his failure to resolve the biggest problem he was handed, Nixon made himself look better by attacking his opponent. Calling the Democrats the party of acid, amnesty and abortion, he rode those attacks to victory with nearly 62 percent of the vote.
            Truman and Nixon both showed that an incumbent president of some ability can win despite problems in his record. They both demonstrated, to differing extents, the value of having a problematic opponent. Obama is certainly a president of some ability, with a better record than either his opponents or many in his own party give him credit for. And it looks as if he’ll be lucky in his opponent. The Republicans appear poised to nominate either their own fringe-element darling or else Mitt Romney, who, come to think of it, reminds you of the little man on top of the wedding cake.

Tuesday, December 6, 2011

Mysteries, When the Puzzle Mattered


            The period between 1920 and 1940 is generally known as the Golden Age of the detective novel. England, at the time, was the most civil and orderly of countries, but in popular fiction, its citizens were being dispatched with great regularity and ingenuity. In addition to garden-variety shootings, knifings and throttlings, murders were committed with poison injected into eggs, with crossbows, and with poison darts. It was a great time for a fictional character to die in style.
            Some of the best-known authors of that period are still in print or readily available at used bookstores. In England there were, among others, Agatha Christie, Ngaio Marsh, Margery Allingham, Dorothy Sayers, Josephine Tey, Anthony Berkeley, Cyril Hare and Father Ronald A. Knox. America contributed Ellery Queen, S.S. Van Dine, and the hybrid John Dickson Carr (also writing as Carter Dickson), whose stories were set in England with British detectives.
            Back in the day, the puzzle was paramount, and the game was for the reader to match wits with the author and try to deduce (or guess, if reason failed) the identity of the killer. Most mysteries today don’t even try to include that challenge, but at the time it was taken very seriously indeed. Carr, master of the locked-room mystery, devoted an entire chapter of his The Three Coffins to an elucidation of all the methods by which someone could be murdered inside a locked room, or other secure space.
            It was not uncommon in those days for books to have a “challenge to the reader” near the end, advising that the clues had all been laid out and that, based on them, deduction of the killer ought to be entirely possible. Father Knox carried the concept of fair play with the reader to the point of codification. He laid down ten commandments (hey, he was a priest!), such as “No more than one secret room or passage is allowable … No hitherto undiscovered poisons may be used.” The rules were a warning against conventions so common then that they needed outlawing; today those conventions have vanished from the genre.
            Even at the time, the conventions and wild plots — especially in the hands of authors less skilled than those previously mentioned — were an object of satire. Consider this bit of dialogue from a P.G. Wodehouse short story, “Strychnine in the Soup,” published in 1932:
            “It was not the vicar,” he said. “I happen to have read The Murglow Manor Mystery. The guilty man was the plumber … he fastened a snake in the nozzle of the shower-bath with glue, and when Sir Geoffrey turned on the stream, the hot water melted the glue. This released the snake, which dropped through one of the holes, bit the Baronet in the leg, and disappeared down the drain pipe.”
            Wodehouse was having double fun on that one, given that The Murglow Manor Mystery clearly cribbed from Conan Doyle and the Speckled Band. They don’t write them like that any more.
            Three quarters of a century later it’s easy to look back on those books with a condescending smile. As stylishly as some of them were written, they don’t begin to stack up against the best mystery and crime novels of today in terms of character development, social awareness and complexity of theme; the casual racist, classist and colonialist attitudes, moreover, are tough for a contemporary reader to swallow. But in the best of them, the sense of innocent joy is still contagious. It was the Golden Age not because of the literary quality, but because of the fun and exuberance of a genre discovering itself. Open one of those books, even now, and the game is on, in a way it isn’t, and likely never will be, in a mystery written in our time.


Friday, December 2, 2011

An American in Paris


            Writing as Poor Richard, Benjamin Franklin was a proponent of frugality and the debt-free life. “The second vice is lying; the first is running in debt,” he wrote in a fairly typical entry for the 1748 edition of the Almanack.
            So there was more than a little irony in the fact that perhaps Franklin’s greatest service to his country was borrowing a ton of money from France to finance the War of Independence. Our freedom from England was, figuratively speaking, charged to a credit card and not entirely paid off.
            Over the recent holiday I brushed up on the topic by reading Stacy Schiff’s wonderful book A Great Improvisation: Franklin, France, and the Birth of America, published in 2005. It outlines in entertaining detail Franklin’s years in France as the American representative, trying to cajole money from a government that was having a hard time meeting its own obligations.
            The weakness of the American appeal was not lost on the French. Charles Gravier, Comte de Vergennes, the French Minister of Foreign Affairs, at one point, according to Schiff, told the American Congress of “his astonishment that an independence-obsessed republic continued to draw for its defense on a foreign monarch rather than taxing its citizens.” Some two and a quarter centuries later, we were borrowing money from China, rather than taxing ourselves, to pay for our military adventures in Afghanistan and Iraq. Nations, like people, apparently don’t change that much.
            France, to be sure, had an interest in vexing England, its long-term enemy. But it is not at all clear that the self-interest alone would have led to a substantial outlay in money to fund a war that had every appearance of being a lost cause. Someone had to ask, and Franklin, feeling his way around a foreign country with almost no direction or support from home, was the right man in the right place.
            His reputation as a scientist and inventor preceded him and made him, commoner though he was, a respected man at the highest levels of state. His embrace of French culture and society endeared him to his host country. Patiently working from those starting blocks, Franklin carried off one of history’s greatest triumphs of diplomacy by personality.
            It was a triumph greatly unappreciated at the time, especially back home. The Continental Congress was as dysfunctional then as Congress is today, and Franklin was frequently presented to that body in the worst possible light by his detractors, among them John Adams and Arthur Lee, the latter of whom was supposed to be helping Franklin negotiate a treaty with France and raise money.
            Franklin’s performance, to be fair, gave his critics plenty of ammunition. He was hopeless as an administrator, made no attempt to protect official secrets (not that it would have mattered much in the spy-infested Paris of the time), and was frequently betrayed, financially and otherwise, by people he had trusted too much.
            His record, in that regard, illustrates a fine historical point: A public official can make plenty of mistakes as long as he or she does a couple of big things right. Franklin’s big things were the treaty with France and the money that came along with it; at the birth of this country, almost nothing was bigger.
            His supreme gift, Schiff writes of Franklin, “was his very flexibility. He was the opportunistic envoy from the land of opportunity that is the United States. His was an initial display of America’s scrappy, improvisatory genius; it is the gift Falstaff gives Hal.” It was the gift that made Franklin, in Walter Isaacson’s words, the Founding Father who winks at us.

Tuesday, November 29, 2011

Eyewitness to History


            Tom Wicker, who died last week at the age of 85, was one of Scotty’s boys, a group of journalists — also including Anthony Lewis, Max Frankel and Russell Baker — who were recruited for the New York Times Washington bureau in the late Fifties and early Sixties by its chief, James “Scotty” Reston. It was, along with the team Edward R. Murrow recruited for CBS radio in the late 1930s, one of the greatest assemblies of journalistic talent in the 20th Century.
            Wicker went on to succeed Reston as bureau chief in 1964 and was probably best known as a Times columnist. For a quarter of a century, his “In the Nation” was one of the paper’s op-ed anchors, skeptical of authority and arguing for civil rights and fairness. When Watergate was a breaking story, the two columnists I always looked forward to reading on the subject were Wicker and Mary McGrory of the Washington Star. There was a reason they were on Nixon’s enemies list.
            In addition to his column, Wicker wrote books, narrated documentaries and sometimes intervened in news situations. Most famously, that occurred at Attica, where he was one of a handful of civilians who tried to mediate a solution to the hostage situation at that state prison. It ended badly, with a bloody shootout, but in his book, which appeared a few years later, Wicker was more sympathetic to the inmates than to the authorities.
            His life, however, is inextricably linked to a single event. In the fall of 1963 the Times sent him on a presidential road trip that wasn’t expected to generate much in the way of news. Wicker was riding in a press bus behind John F. Kennedy’s motorcade in Dallas when the shooting started, and was the only Times reporter on the scene. The magnitude of the task facing Wicker that day was described a few years later by Gay Talese in his book The Kingdom and the Power:
            “During the next few hours the details began to pile up … there were truths, half-truths, errors, illusions, rumors, secondhand accounts, thirdhand accounts — all these were passed freely to the press, were circulated among them, and there was very little time to check these facts or allegations. Wicker and the rest (of the reporters at the scene) had to go largely on instinct, the totality of their experience so far in life, their insight into others, a special sense that good reporters develop and use in a time of crisis …
            “Wicker was writing for history … If there were major errors in Wicker’s story, which there were not, they, too, would survive, degrading Wicker among his colleagues but degrading the Times much more among its readers, not only the million or so who would see the story that day,  but also those who would read it a half-century from now, the students and historians who would be turning it up again and again on microfilm.”
            Microfilm? What a blast from the past. After hearing of Wicker’s death, I looked up the story online from the comfort of my home (www.nytimes.com/learning/general/onthisday/big/1122.html) and it was indeed impressive — stunning in its completeness and factual accuracy, especially considering how little was known in the chaos and immediate aftermath of the assassination. All the more so, given that of the 106 paragraphs, only one described something Wicker actually saw with his own eyes: Jackie Kennedy walking out the emergency entrance of Parkland Hospital after the president had been pronounced dead.
            Anybody can send a tweet or shoot a video on a cell phone these days, and those things have indisputably added to the overall level of raw information. But real journalism isn’t about capturing a scene or a moment; it’s about pulling together a broad range of information from a broad range of sources and making knowledgeable, nuanced sense of it, often in a hell of a hurry.  It is no job for an amateur, and it has humbled many a professional. Wicker’s passing and his singular achievement remind us of that at a time when we seem to be forgetting.

Friday, November 25, 2011

Basing Policy on Bad Movie Plots


            In two decades of working for a daily newspaper, I can tell you without checking how many stories we ran during that time about someone in our community who used a gun to defend himself and his family in his house. Exactly one.
            Over that same period we ran quite a few stories about people who shot spouses, family members, fellow card players, innocent bystanders and themselves with a gun. There were so many of those I couldn’t give you an exact number.
            So now that the U.S. Supreme Court has ruled that there is an individual right to own a gun, by all means go ahead and buy one if you want. It may make you feel better and more secure, and good for you if it does. But please try not to forget the statistical reality: Having a gun for protection is a comforting illusion, but an illusion nonetheless.
            There’s an obvious reason that guns are so rarely used in self-defense, and no one ever seems to mention it. Simply put, any villain that you need a gun to deal with isn’t likely to give you a chance to use it. Whether he’s breaking into your house or leaping from a dark alley to mug you, he’s going to try to get the drop on you.
            Of course you may be lucky and get enough advance warning to draw your weapon in time. Long odds, but it does happen. You still have to be a better shooter than the bad guy, and you have to be sure you’re really getting the bad guy. One of the stories I wrote for the paper had to do with a kid under the age of 10 who saw a burglar outside, grabbed a gun that was in the house and plugged the intruder — who turned out to be a member of the household who had, fatally as it turned out, lost his key and was looking for another way in on a dark and fateful night.
            In the national security realm, there’s a similar logical fallacy involving torture. How many times have you heard someone, arguing against the non-torture crowd, ask what you would do if you’d captured a terrorist who knew where a nuclear device was about to be detonated in a major city? Wouldn’t torturing that terrorist to get the information be morally acceptable in the name of saving millions of innocents?
            Perhaps, with grave reservations, it would in theory. But how would such a situation plausibly arise in the real world? To know, with the certainty the situation would require, that the person in custody is a terrorist and is involved in a nuclear bomb plot, you would have to have been tracking that individual and his associates for some time. Any competent police or intelligence agency would round them all up the minute a nuclear plot was suspected, not let them all get away except for the poor wretch headed for the rack.
            And even supposing the other conspirators slipped through the dragnet, they would have to know that their missing comrade might be compromising the mission. He, for his part, might figure that all he has to do is send his interrogators to the wrong place once, giving his people a chance to detonate the bomb. The odds that he’d spill the details of the plot in time for authorities to stop the bombers and save millions are not good. In fact, they’re about as good as the odds that having a gun in your house will some day protect you against a criminal.

Tuesday, November 22, 2011

A Warm Feeling for Thanksgiving


            Twenty-five years ago, on the night before Thanksgiving, we closed the deal on the house in which we now live. We didn’t move in until March, so this will be our twenty-fifth winter and it should be pleasantly different from all the others.
            One of the reasons I suspect the house was on the market for a year before we bought it was the woodstove plunked down in the middle of the living room. It seemed like an afterthought and made a sensible furniture arrangement impossible. When we met with the owners to sign the final papers just before the end of the year, I mentioned that we were planning to take out the woodstove and put a fireplace in the living room wall.
            “I wouldn’t do that if I were you,” said the owner. He was a plumbing contractor, one of the biggest in the county, who had built the home for himself and his wife, then had built and moved into a new one next door. “I put in a newfangled hot-water heating system that’s supposed to be the great new thing, but it hasn’t worked at all. We had to put the stove in because we were so cold the first winter.”
            We didn’t listen to him, but we should have — sort of.
            In the fall of 1987 the weather was pleasant until about Thanksgiving, when the first frost hit. We woke up one morning to find the roof covered with white and the temperature inside the house at 58 degrees. We set the thermostat to 70 and went to work. When we got home at 5 that afternoon, the temperature in the house, with the heat turned on all day, had risen to 62. My mother came to visit us for the holiday and spent most of it wrapped in a blanket, even when she was walking around the house.
            The fireplace worked great, though, as long as you were sitting within a few feet of it. If you were working in the kitchen or sleeping in the bedroom upstairs, not so good.
            So for 24 winters we coped with the cold in various ways — fires, space heaters, heavy sweaters. From time to time we talked about biting the bullet and putting in an entire new heating system, but it never came to anything. Our California winters are fairly mild, as a rule, and the cold was oppressive for only six to eight weeks a year. Six to eight long weeks.
            During a cold snap this spring, we turned the heater on one last time and heard a thud that sounded like an anvil dropped from the Empire State Building. The heater was dead altogether, giving out nothing, not even the inadequate amount it had before.
            We waited through the summer, when we never use it anyway, then after doing some homework, called a plumber who specializes in that sort of system. He came over, took a look and scratched his head.
            “This thing been keeping you warm?” he asked. We said it hadn’t. “I’m not surprised,” he said. “It’s only heating the water to 120 degrees instead of 180 like it should. Plus the outlet valve is open where it shouldn’t be, so half the water is going straight to the septic tank instead of into your house. How come you never did anything about this before?”
            Biting her tongue, Linda said that if we’d bought the house from an English teacher, we would have, but since the leading plumber in the county, who had to live in the place himself, had said he couldn’t get the heater to work, we assumed the problem was unfixable.
            Five thousand dollars later, we have a new, energy efficient heater that’s running at the right temperature and pumping the hot water into the heating system, not the septic tank. We can tell the difference every day, and this Thanksgiving we will be giving thanks, among other things, for heat.

Friday, November 18, 2011

The Team That Always Let You Down


            They say that the closest bond between fans and a team occurs not when the team wins a championship, but rather when it comes achingly close and falls just short. I’ve been there.
            Growing up in the Los Angeles area, I was a fan of the Dodgers and the Rams. There was a decade, the 1970s, when the Rams tormented their fans beyond endurance. Along with Pittsburgh, Dallas, Oakland and Miami, the Rams were one of the NFL’s elite teams that decade, with a defense arguably as good as the fabled Steel Curtain in Pittsburgh. The Rams were also the only team on the list not to win a Super Bowl that decade.
            How did they fail? Let me count the ways.
            • After the 1976 season they traveled to Minnesota to face the Vikings in the NFC Championship game. With the temperature near zero, the Rams took the opening kickoff and burned half the first quarter on a drive that took them to a fourth and goal from the Minnesota six-inch line. As they attempted a field goal, one of the linemen lost his footing on an icy patch of turf, allowing a defender to come through and block the kick, which was returned 99 yards for a touchdown. Final: 24-13, Minnesota.
            • The following year the Rams hosted a much weaker Viking team in Los Angeles, where weather shouldn’t be a factor. When I turned on the TV set to watch, the first sound I heard was Dick Enberg saying, “It’s pouring rain at the Coliseum in Los Angeles.” Three and a half inches fell during the game. Final score in that slopfest: 14-7, Minnesota.
            • In 1978 the Rams handily beat the Dallas Cowboys in the regular season and hosted the NFC championship game in Los Angeles. At halftime it was a 0-0 tie, but the Rams had lost their first and second-string quarterbacks and every running back on the roster to injury. In the second half, the third-string quarterback tried handing off to a defensive back who hadn’t been asked to be a ball carrier since high school. Final:  Dallas 28, Rams 0.
            • Finally, the 1979 team, with only a 9-7 record, made it to the Super Bowl. The Rams were 11-point underdogs to the Steelers, but led 19-17 after three quarters. Terry Bradshaw completed a long touchdown pass to John Stallworth, literally inches over the outstretched fingers of Ram defender Rod Perry, and the Rams choked in the fourth quarter. Final: 31-19, Pittsburgh.
            After that, the Rams were no longer an elite team. They were good during most of the 1980s but played in the same division as the San Francisco 49ers, who were winning four Super Bowls that decade. A wild-card berth and a quick exit were about all a fan could hope for.
            In the 1990s, they ceased to be even that good. When the team moved to St. Louis in the second half of the decade, I officially terminated my fanhood. A California man, born and bred, simply cannot root for a Midwestern team as his first choice.
            Then in 1999, out of nowhere, lightning struck. Backup quarterback Kurt Warner paired with offensive coordinator Mike Martz to produce The Greatest Show on Turf, a passing game that shredded defenses and took the Rams, the St. Louis Rams, to a 13-3 regular season record and a Super Bowl victory over the Tennessee Titans, 23-16.
            During the season Sports Illustrated ran a cover of Warner with the headline, “Who Is This Guy?” My question was where was he in the 1970s, when the Rams were a good quarterback away from a Super Bowl win every year? Life’s a funny old dog, never more so than for the sports fan or team.

Tuesday, November 15, 2011

'Go Ahead and Buy the House'


            The small family newspaper group I used to work for once sent two of its top non-news, business-side executives to a high-level management seminar where they were given the following scenario and asked to come up with an answer.
            Ad revenues have been slumping and cuts are going to have to be made, including, probably, terminating some employees. You are still in the process of figuring out what cuts will be made and in what time frame when an employee likely to be let go comes to you.
            “My spouse just got a promotion,” the employee says, “and we’ve been looking at houses and saw one we want to make an offer on. The only thing that concerns me is that I know business has been slow here, so I wanted to ask if there’s any reason we should wait.”
            That’s a great seminar and real-life question that raises a whole range of judgments a manager has to make involving a variety of ethical and pragmatic business concerns. Given the scope of those concerns and the diversity of employee personalities, there’s probably no one correct answer.
            There is, however, one indisputably wrong answer. That would be to look the employee squarely in the eye and say, “Go ahead and buy the house.”
            Which was exactly what the seminar leaders said you should do. The two people from our company objected strenuously, saying you can’t lie to your employees like that, but in a room of two dozen executives they were the only two who had a problem with it. Everybody else either thought that was a fine thing to do or was reluctant to dissent.
            I remember that story very well because when I first heard it, it was an aha! moment at a couple of levels. Up to that point I hadn’t thought there were such things as seminars that actually taught executives to be slimeballs; I had assumed that was simply an inherent character defect cultivated through practice.
            Even more disturbing, though, was the part of the story where almost no one objected. I tried to imagine the people taking that seminar. I’m sure that many of them were churchgoers, family men and women, highly respected pillars of the community. Yet somehow, in that corporate setting, the simple human decency evident in the rest of their lives went out the window.
            Although I may not have realized it at the time, it was an illustration of how any corporation or other large organization can corrupt the people who work for it. There is an almost irresistible tendency to get caught up in an organization’s imperatives and interests and to confuse those with what’s right. The organization cares only about its bottom line, whatever that is, and the human panoply of values is not always fairly matched against that single-minded focus. Over time you can see how that sort of pressure could lead Wall Street executives to be clueless about how their bonuses are perceived or could lead an official to treat a report of sexual assault like an annoying memo that had to be cleared out of the inbox and kicked over to someone else as soon as possible.
            In my brief management career, I was leaned on to do things I felt uncomfortable about and even did some of them. Maybe it was no big deal, but I’m glad I got out when I did. I’d hate to think that a few years down the road I might have gone to a seminar and remained silent when the facilitator said I should tell my employee to buy the house.                       

Friday, November 11, 2011

In the Presence of the Famous


             Sid Melton died last week in Burbank at the ripe old age of 94. If you’re much under 60, it’s unlikely the name would register, and even if you’re over 60, the name alone might not do it. See his picture, though, and you’d remember.
            Melton was one of those great character actors who had a long run in TV and movies. He’s probably best known for playing “Uncle Charlie Halper,” the owner of the nightclub where Danny Thomas performed in his long-running TV show. Or, perhaps, as Alf Monroe, the inept handyman on Green Acres. Or, if you really want to go back, as Ichabod Mudd, the sidekick of Captain Midnight on the Saturday morning TV show of that name in the early 1950s. To the end of his life, the Times reported, people would come up to Melton and mimic his standing line from that show, “Mudd, with two D’s.”
            When I got the news, however, I remembered something different and more personal. Sid Melton was in the room on my first date in the summer of 1967.
            I’d finally worked up the nerve to ask out a girl from Holy Family High School, who’d taken a summer school drama class with me at Hoover High, my school in Glendale. We went to see Neil Simon’s Barefoot in the Park at the Las Palmas Theatre in Hollywood, then had a pizza afterward at Miceli’s Restaurant next door. I just Googled, and both the theatre and Miceli’s are still around, against all odds, 44 years later.
            Melton had one of the major supporting roles in the play, which I remember as being very funny. And remembering seeing him in person got me to thinking about some of the other Hollywood figures I saw live.
            There was Marlene Dietrich, when she did a one-woman show at the Music Center in Los Angeles. There was John Carradine, at the end of his career and life, starring in a local community college production of The Man Who Came to Dinner, so ill and infirm he could barely say his lines. At a luncheon at UC-Santa Cruz, in the early 1970s, I sat next to Jean Arthur, long-retired by then. A  couple of years later I was in Berkeley for an evening with the legendary Howard Hawks, who directed John Wayne in Red River and Rio Bravo, Bogart and Bacall in The Big Sleep and To Have and Have Not, and Cary Grant in Bringing Up Baby and His Girl Friday.
            No one is alive now who saw Sarah Bernhardt act, and very few people living  today have seen John Barrymore or Helen Hayes in live theater. It won’t be that long before nobody’s around who saw any of the people I’ve seen. Movies and TV will keep their names and images around for a long time to come, but when no one is left who knew them, however fleetingly, they’ll be ghosts.
            Nearly everyone I know has a memory of seeing a movie or TV star live, and they never forget it. There’s something about being in the presence of the real person that can’t be duplicated. And on rare occasions, a chain develops, linking mere mortals with a famous person from long ago. At a Rotary Club meeting, I once shook hands with a man who shook hands with a man who shook hands with President Lincoln, who was the media star of his age. It’s as close to Lincoln as I’ll ever get, and I cherish the connection.

Tuesday, November 8, 2011

Everyone Needs an Exit Interview


            Some time back a regular client sent me to interview a person associated with them who had gone on to achieve great distinction. Even though it meant driving more than a hundred miles, they asked me to do the interview in person. The subject was seriously ill and awaiting major surgery, and the client wanted to be sure I could hear what was being said and evaluate the level of fatigue as the interview went on.
            The interview took an hour and a quarter, and the subject never flagged. The voice was strong and clear; the answers lucid and detailed. As I walked out the door afterward I remember thinking, this was probably the best medicine the subject could have received: A chance to look back on past triumphs and important things done well.
            I’ve interviewed other people who were ill or dying and had similar experiences. In one instance the person was so slow and confused when the interview was scheduled that I wondered if we should even be doing it. Then, on interview day, he was so sharp and precise and loquacious that I could hardly get a question in. He died eight weeks later.
            For many years the New York Times had a reporter, Alden Whitman, whose job was to prepare obituaries of famous people, the ones who deserved the full treatment in America’s newspaper of record. They were written and kept on file so they could be pulled out when the moment arrived, whether after a long and public illness (Francisco Franco) or unexpectedly dropping dead on a golf course (Bing Crosby).
            Whitman was almost never turned down when he requested an interview, and no wonder. Whatever one’s fears or expectations for an afterlife might be, knowing that the Times is going to give you the full obituary treatment has to take a bit of the sting out of death.
            In today’s journalism, where nearly everything is done on the fly and on a shoestring, that sort of advance planning rarely occurs. From time to time our local papers are caught flat-footed when a newsmaker from the past shuffles off the mortal coil unexpectedly (or, perhaps more accurately considering the age of those involved, without prior notice to the press).
            When that happens, it’s not unusual for me to get a call from a reporter seeking information or comment on someone whose public life I used to cover. No matter how busy I am, I always take time to help as much as I can, regardless of my feelings for the person. The wicked and the righteous alike deserve a good obituary, and obituaries of the wicked make for more interesting reading.
            What the obituary and the interview have in common is the conferring of importance on a life. When an oral historian comes to the nursing home to do an interview with someone closing in on the end, it’s a validation. It’s saying to the person interviewed, “What you did is still interesting or important, even after all these years. I want to hear about it, and through me, others will, too.”
            One of the great fears we carry with us through life, and one that can grow more acute with age, is that our lives were wasted or poorly spent. The person who’s interviewed late in life has to realize that’s not so. What a tonic. Medicare should pay to have everybody interviewed.
           
           
           

Friday, November 4, 2011

Not Everybody Wants to Be Rich


            In the ideology of economic individualism, which seems to have body-snatched the entire Republican Party, a core belief is that anybody can be rich, so it’s your fault if you’re not. I have a couple of problems with that.
            To begin with, the idea that anybody can get rich is one of those conceptual underpinnings of a democratic society that shouldn’t be mistaken for reality. Anybody can get hit by a meteor, too, but it isn’t going to happen to most of us. The possibility of riches is an example of what I call the Alaska syndrome. It’s great to know all that wilderness is out there, but I’m probably never going to visit it.
            One of my high school teachers used to say that any idiot could make a million dollars (this was back in the days when a million was real money) as long as the only thing that mattered to him was making the million. As soon as anything else became important, he said, the chances of getting rich dropped dramatically. This gets more to the heart of the matter.
            Leaving aside the question of how many people have the temperament to take the risks that lead to wealth, there’s the larger question of how many think the effort is worth it. There’s a cost to making money, and not everyone wants to pay it.
            If you decide to go for wealth, you have to put up your own money or round it up from other people; spend sleepless nights wondering if things will work out; realize that something utterly unforeseen could blow up your plans; and know that bankruptcy is around the corner if you make a mistake or just get a bad break at the wrong time. All this pretty much wipes out the rest of your life, making it hard to take a vacation, spend time with family and so on.
            If, by contrast, you could work a 40-hour-a-week job that paid a decent middle-class salary, allowed you to have dinner with the family most nights, watch football on the weekend, and take a vacation every summer, wouldn’t that be the more sensible way to go? A lot of us would agree, although finding that kind of job is getting tougher and tougher.
            That second path opens a lot of doors for society as well. If you get rich, you generally give money, because that’s what people ask you for. If you’re making a living, you give of yourself because that’s what you have to offer. The people who make a living are typically the ones who run the PTA, lead the church choir and coach youth sports. I’ve known a lot of people like that over the years, and they play a huge part in making a community vibrant.
            When your work is your life, it can become easy to assume that everyone else feels that way, too. For most people, work is an important part of their life, but only a part. An advanced civilization is one that cultivates policies encouraging the creation of jobs that enable people to make a decent living without consuming their entire lives.
            And that civilization needs to have support mechanisms to allow people the freedom to give of themselves to the community. A senior center can give an aging mother a safe place to go and free up her daughter to run the PTA or coach a soccer team. Medicaid covers nursing home costs so a son doesn’t have to take a second job and can continue coaching Little League or running the choir. When those enabling jobs and programs vanish, civil society frays at the edges and becomes, to some degree, less civil.
           
           

Tuesday, November 1, 2011

The Conspiratorial Mind at Work


            Donald Ogden Stewart, one of the resident wits of the Algonquin Round Table, has been credited with the observation that there are two kinds of people in the world: Those who think there are two kinds of people in the world and those who don’t. Include me in the first camp. I think the two kinds of people in the world are those who believe in conspiracies and those who don’t.
            Conspiracies happen, to be sure, but my experience and observation lead me to believe that they are not as common as most people think and are rarely successful, for the simple reason that Ben Franklin posited in Poor Richard’s Almanac some two hundred fifty years ago: Three can keep a secret only when two are dead.
            One of my favorite principles, in trying to sort out a tough question, is that of Occam’s Razor, which states, in essence, that the simplest of competing theories should generally be preferred to more complex theories or explanations.
            Consider the application of that principle to an issue such as President Obama’s birth certificate. The simplest theory would be that since he produced a certificate, the genuineness of which was attested to by the governor and the secretary of state of Hawaii, which issued it, we should accept its validity and move on to more important matters.
            To believe otherwise, you would have to believe that those elected officials were lying; that someone thought, nearly fifty years ago, to plant bogus birth announcements in the Honolulu papers, realizing that this phony baby was going to be president; that every candidate, Republican and Democrat, who ran against Obama and checked his background with a microscope decided to be quiet about it; that the same could be said for every single news outlet, which otherwise was starving for a sensational scoop … the head begins to swim — no, flounder — in rough intellectual waters.
            Experts who study such matters say, and it makes sense, that conspiracy theories and paranoid reactions tend to be triggered by shocks and calamities. I’ve often thought, for example, that the assassination of President Kennedy was such a profound shock that many people could cope with it in no other way but to imagine a conspiracy at work. The notion that one nut with a gun could so easily upset the order in which we imagine we live (even though that’s the way the evidence points) is not acceptable. It makes the world too random and chaotic, hence some greater force must have been behind the disruption. Since it’s impossible to prove a negative, that suspicion will never go away because there’s always a chance, however remote, that it could turn out to be right.
            The crash of the economy in 2008 provided fodder for conspiracy buffs of all stripes. It didn’t happen because normal people (many of whom you wouldn’t mind having as neighbors) did some things they may have felt uneasy about, but also figured couldn’t be that bad. It happened because (take your pick) greedy bankers and Wall Street financial houses or government officials deliberately set out to screw the country. There’s a quote that captures much of the current feeling.
            “How can we account for our present situation unless we believe that men high in this government are concerting to deliver us to disaster? This must be the product of a great conspiracy, a conspiracy on a scale so immense as to dwarf any previous such venture in the history of man.”
            Those words were spoken on the Senate floor by the junior Republican from Wisconsin, Sen. Joe McCarthy, on June 14, 1951.
           
           

Friday, October 28, 2011

Driving in Red-State California


            Heading north and inland from the San Francisco Bay area, you can see the landscape change abruptly. At Vacaville, Highway 505 branches off to provide a shortcut for northbound travelers between I-80 and I-5. Genentech has a big complex at that junction, but once past it, you’re in a different country.
            It’s not just that the landscape is open and sparsely settled. The signs along the road reflect the fact that the culture has changed as well. On a recent fishing trip that took my son and me through that area, we saw signs defensively proclaiming the importance of farmers and agriculture in general; signs, some with gruesome drawings, asserting that Jesus had died and shed blood for our sins; Tea Party signs, and others of similar provenance. A year ago there were a few “Produce the Birth Certificate” signs, but none were in evidence this year.
            Politically California is about as blue a state as you’ll find, but the blueness predominates in the urban areas near the coast, where most of the population lives. The farther inland and closer to the mountains you get, the more it feels like a red state.
            More than a decade ago I took a driving trip through the northeastern part of the state and southern Oregon a week before the presidential election. I literally went three days without seeing a single campaign sign for Al Gore, but there were hundreds for Bush/Cheney. Gore carried both states, but not the parts of them that I was driving through.
            People who live in small towns and rural areas tend, and this is a generality, to feel that they are misunderstood by the great majority that lives in urban and suburban America. Some politicians — Sarah Palin comes to mind — do a terrific job of pandering to that resentment. And, as with most resentments, there’s a kernel of truth behind the feeling.
            But a lot of the suspicion is extrapolation and exaggeration of some reasonably held positions. I don’t know anyone, among my liberal friends, who hates farmers and agriculture. Most of them, in fact, are highly supportive because they feel farming protects open space while providing a valuable and essential commodity — our food. That many of those same people believe, as I do, that farmers should go easy on mass-produced chemical pesticides and should provide their field workers with decent wages and working conditions is by no means a slam on agriculture, though it seems to be taken as such.
            Misunderstanding goes both ways. Many people in the cities abhor guns and hunting, not fully understanding that in small, economically distressed areas, killing a deer and eating its meat is a key part of the winter food budget. Many people in the country, who use a gun in that fashion, don’t seem to comprehend, for their part, that it’s not a good idea to have a lot of drug dealers and gangbangers running around with concealed weapons in a densely populated urban area.
            People who live in the less populated parts of the country are our last Jeffersonians. There’s a strong streak of self-sufficiency, a suspicion of elites and government (the greater the remove, the greater the suspicion), and a belief that community cooperation can take care of most problems. Jefferson felt that way, but he was quick to admit that his philosophy would work only to a certain scale; that once a society became too highly populated, too urbanized, and too industrialized, small government would no longer work. In most of the country, that time has passed, but north of Vacaville, the spirit lingers on.
           
             

Tuesday, October 25, 2011

Tomorrow Is No Sure Thing


            From time to time I find myself thinking about a news story I edited years ago.
            It had to do with a man who worked in an office in a high-rise in a large city in the East or Midwest and was a creature of habit. Every Friday morning he got his paycheck, and exactly at noon he left for his lunch hour, rode the elevator to the ground floor, went into his bank nearby, and deposited the check.
            One week he had something coming up on the weekend and had to run a couple of extra errands. So instead of leaving the office at noon, he left at 11:45, rode the elevator to the ground floor, went into his bank, and walked through the door just as a holdup in progress turned into a shootout. He was killed by a stray bullet, leaving a wife and children to carry on without him, and probably to wonder forever why he couldn’t have left the office at his regular time.
            I thought of that story again recently when I read another one, about a 39-year-old woman who left Monterey on a Sunday afternoon to ride her motorcycle to her home north of San Francisco.
            At about 4 p.m. on a clear sunny day, just shy of the northern boundary of Monterey County, a car being followed by the Highway Patrol crossed the center line of the two-lane road she was on. She was killed when she hit it, and the driver of the car was killed when it ran off the road and down an embankment. If she’d left Monterey a few minutes earlier or later, she’d still be alive.
            Over the years I’ve written and edited more stories than I care to remember about people who woke up in the morning having no idea that it was going to be their last day on earth. They ranged from wrong-place-wrong-time incidents such as the above to more mundane departures.
            In the latter category was a story that ran as a plain obituary. It was about a man in his late 70s who keeled over from a heart attack while dancing the Tennessee Two-Step at an RV park where he and his wife had stopped while on the road. You could ask for a few more years, but how many people die that happy?
            The news business puts stories like this in front of you all the time, and it’s hard to deal with any number of them without developing a true appreciation of the fragility and transience of life. I’d like to be able to claim that this awareness has changed me profoundly for the better, but it hasn’t. I still waste time, put off things I shouldn’t, and generally act as if I’m going to live forever. But in some small way, I do believe I’ve learned to live in the present more and the past and future less, and to enjoy the moment. That’s something.
            Oliver Wendell Holmes once said that his idea of the perfect life was to live in vigorous good health to the age of 90, then be shot to death by a jealous husband. He got the first part, but not the second. I’d like to catch that bullet, too, but the odds aren’t good. The Tennessee Two-Step is a slightly better bet, but I’d have to learn the dance. So to myself and everyone else, I’d like to say have a nice day today. There’s no guarantee we’re going to get another one.
           
           

Friday, October 21, 2011

Jefferson's Rosebud


            In 1770 fire destroyed Shadwell, the boyhood home of Thomas Jefferson in Virginia. Burned, along with the building, were nearly all the documents relating to the first 27 years of Jefferson’s life. If there was a “Rosebud” in his childhood, we’ll never know.
            Because the Founding Fathers left such an extensive paper trail, the loss of Jefferson’s early documents may not seem like such a big deal. Up to that point he’d hardly been in the public eye, and the rest of his life was highly documented, both by his own papers and by those of others, including Adams and Franklin, who dealt with him regularly.
            Still, you have to wonder. Jefferson was brilliant, dreamy and ethereal, possibly the most idealistic man ever elected president. He was pretty much that way from the time he entered the public record, but was there a formative event or events in his childhood that explain some of his character? Rosebud, the boyhood sled in Citizen Kane, was a pat explanation of the sort so beloved by Hollywood and so seldom seen in real life. A Rosebud for Jefferson is unlikely, but if you’re a historian or journalist, more information, even when inconclusive, is always better.
            Much of history lies behind metaphorical doors, closed to us forever. When the library at Alexandria burned, we lost much of our knowledge of the ancient world. Other manuscripts were lost during the Middle Ages, and still others survived, undiscovered for centuries in some monastery or other.
            Stacy Schiff, in her book Cleopatra, makes much of how little is known of her heroine’s life and times, and how much has to be inferred by the historian, weighing a range of sketchy and incomplete evidence. In such an instance, a historian’s work becomes largely that of an intuitive artist.
            Even in more recent times, there are things we just don’t know, and not always because a fire destroyed the evidence. What, for instance, were Lincoln’s religious beliefs, if any? He didn’t oblige us, as Ben Franklin did, by writing a detailed letter on the subject months before he died. Was Lincoln a non-churchgoing Christian? An atheist? A freethinker who conceived of a vague higher power and incorporated many of the teachings of Jesus into his philosophy of right living? I lean toward the third, but it’s impossible to prove or disprove any of those notions.
            We now live in an age where almost everything is photographed or recorded on video, and even so part of the visual record gets lost. In the early days of television, many of the shows, including the newscasts, weren’t saved. Incredibly, the people at the networks didn’t think they were valuable. If live, they often weren’t recorded, and if recorded, they were often erased and recorded over. It was big news a while back when a kinescope was found of Game 7 of the 1960 World Series between the Pittsburgh Pirates and the New York Yankees. It was one of the greatest baseball games ever played, and all we have is that one recording, which Bing Crosby, a part owner of the Pirates, ordered made because he was out of the country at the time.
            Nor will even the newest technology save everything. Ask Jamie Masada, owner of the Laugh Factory, a Southern California comedy club. For years he videotaped performances that included priceless footage of future stars in their formative years. To preserve the video he turned it over to an online backup storage company, which, in some unexplained way, lost 1,500 hours of it. Proving, I suppose, that it doesn’t take a fire to wipe out history. A speck of dust on a hard drive will suffice.