This blog is devoted to remembrances and essays on general topics, including literature and writing. It has evolved over time, and some older posts on this site might reflect a different perspective and purpose.

New posts on Wednesdays. Email wallacemike8@gmail.com

Tuesday, June 28, 2011

Making a Fetish of the Constitution


            The next time you hear someone rhapsodizing about the sanctity and brilliance of the U.S. Constitution, you might want to ask a simple question. Would anyone outside a lunatic asylum be saying such a thing if the South had won the Civil War — as it very nearly did?
            Of course not, and to ask that question is to raise a point that ought to be fairly obvious. The Constitution has survived as long as it has and served as well as it has because subsequent generations were able to interpret it in light of their times and change it when necessary.
            The outcome of the Civil War enabled passage of the 13th, 14th and 15th Amendments, which outlawed slavery, established the principle of equal protection under law and forbade racial discrimination in voting. It took another century to make that last one a reality, but let’s not quibble. Would a Constitution without those amendments (or the ones giving women the vote and establishing the popular election of U.S. Senators) carry any moral weight today?
            There really isn’t any other way to say it — the Founders punted the issue of slavery to posterity. Historians of a sharply leftist bent tend to be critical of that decision, but it was probably the best they could have done. Lucky for our country and their reputations that Lincoln came along when he did.
            It is often assumed — wrongly, I believe — that the purpose of the Constitution was to limit what the federal government could do. More accurately, its purpose was to limit the damage that could be done by individuals, factions, and popular sentiment run riot. It was not so much about creating a narrow definition of the government’s role as it was about creating a process, full of checks, balances, and speed bumps, to ensure proper deliberation and a cooling-down period before government acted.
            James Madison, widely regarded as the Father of the Constitution, came to the convention of 1787 determined to get a document that would allow the federal government to override any state law that was deemed inconsistent with the new constitution. He had George Washington on his side, but the states’-rights crowd wasn’t having any of it. Madison later flipped on that point when he realized that a strong central government would probably speed the end of slavery. Three decades after his death, slavery ended and the 14th Amendment pretty much incorporated his original idea.
            A liberal and broad interpretation over the years has also been essential to the continuing utility of the Constitution, and that sort of expansive approach has been aided by the vagueness of the document. Consider the following sentence from Alexander Hamilton’s “Report on Manufactures,” submitted to the U.S. Congress on December 5, 1791. Arguing that federal support of manufactures was permissible under the general welfare clause of the Constitution, Hamilton wrote:
            “The terms ‘general welfare’ were doubtless intended to signify more than was expressed or imported in those (powers of Congress) which preceded; otherwise numerous exigencies incident to the affairs of a nation would have been left without a provision.” In other words, Congress can do just about anything it deems necessary or desirable for the common good.
            Interestingly, when the Constitution was being debated, the vagueness of the general welfare clause was one of the things opponents pointed to as giving the new government too much power. They were right about the vagueness, but wrong about the too much; the process has pretty much kept that in check. But the anti-Federalists’ arguments live on today, most prominently vocalized by the contemporary Tea Partiers, who, ironically, profess their undying love for a Constitution that exists mostly in their imaginations.
           
           
           

Friday, June 24, 2011

The Sheer Awful Cussedness of Things


                  A statistic in the newspaper the other day called to mind a long-held fantasy of mine. More on the statistic later, but first, the fantasy.
                  It’s that whenever Congress takes up legislation that would affect the economic fortunes of most Americans, its members would be forced to watch the first 15 minutes of Pixar’s Up and write a 100-word essay on what it shows about how real people live and save money.
                  (If you haven’t seen it, a young couple, upon marrying, begin to set aside money for a dream trip to South America. Every time they get a bit ahead, something happens: a tree falls on the house, the car crashes, and they have to raid the vacation fund. Finally, in their old age, the wife dies, and they still haven’t taken the trip.)
                  And if it’s that tough to save up for one crummy vacation, how can people be expected to save for retirement? The answer is, they can’t. Retirement planning is essentially a luxury for the 20-25 percent of Americans who make the most money. The other three quarters — unless they’re pathologically stingy or exceptionally lucky — will find that long-range financial planning is generally thwarted by exigent circumstances.
                  Even those who are fortunate enough to be in the top quarter can find themselves frustrated in their planning. Any number of things outside their control could break their bank. For instance:
                  • An extended period of unemployment can put anyone in a hole that will be tough to get out of.
                  • A long illness that puts someone out of work can have the same effect.
                  • If the above long illness results in significant medical costs uncovered by insurance, it can not only wipe out savings, but lead to bankruptcy as well.
                  • A child could develop a substance abuse problem or get in trouble with the law or both, requiring whacking fees for rehab or legal expenses.
                  • A divorce, which ends around half of all marriages, can wreak as much havoc on a family’s finances as a serious illness or a long stretch of unemployment.
                  • An aging parent could require an extended period of care or treatment not covered by Medicare at the end of life.
                  • A family business that has been a prosperous concern for decades can be wiped out overnight by a combination of circumstances — for example, a severe recession and a flood.
                  That’s the sheer awful cussedness of things, or, if you prefer, that’s life. Either way, it goes a long way toward explaining the statistic referred to earlier, which is:
                  Thirty-four percent of all working Americans say they have saved less than $100 for their retirement. You read that right; there are no missing commas or zeroes.
                  And I am guessing (the article didn’t say) that a similar percentage has saved no more than a modest nest egg, which is basically a contingency fund rather than an annuity.
                  If this is the reality, the importance of Social Security and Medicare in their present form becomes quickly apparent. There has to be something that’s there for people, no matter how long they live and how wisely they save and invest. Both programs need constant and vigorous refinement for efficiency and affordability, but the basic structure shouldn’t be messed with.
                  Of course the reason my fantasy is a fantasy is that some elected officials will never grasp that point. I don’t understand their thinking and can’t read their minds, but I have to wonder: Hasn’t anything bad and unexpected ever happened to them, their families or their friends? Or do they really believe that cutting taxes by a few hundred dollars a year will give people enough money to weather life’s storms? That’s an even bigger fantasy than mine.
                 
                 
                 
                 
                 
                 

Tuesday, June 21, 2011

Bringing Civilization to the Wild West


            Not too long ago I watched John Ford’s classic western The Man Who Shot Liberty Valance. It was released in 1962 — nearly 50 years ago — and shows its age.
            That’s not because it was made in glorious black and white or because it features John Wayne and James Stewart near the end of their long and storied careers. It’s not even because of the John Ford hokum and blarney (unimaginable in a movie today), which can test the patience of even the most sympathetic modern viewer.
            No, what dates the movie to the modern sensibility is its theme: the benefits of law and government, and of the federal government in particular. For all the talk about Hollywood being a viper’s nest of liberals and worse, it’s hard to see how a movie with that theme could be made today. Who wants to hear a message like that?
            A brief synopsis, if you don’t know the film. Idealistic young lawyer Ransom Stoddard (Stewart) arrives in the frontier town of Shinbone hoping to bring the power of the law to the wild west. He finds the town terrorized by sadistic thug Liberty Valance (Lee Marvin) who fears no one but the aging gunfighter Tom Doniphon (Wayne). Stoddard starts a school to teach reading, writing and government to the farmers and shopkeepers.
            Called into a showdown with Valance and facing certain death, Stoddard gets off a seemingly miraculous wrong-handed shot that leaves Valance dead in the middle of main street and the townspeople feeling empowered. Based on his reputation as the man who shot Liberty Valance, Stoddard leads Shinbone’s delegation to the territorial convention debating statehood and later goes on to become the state’s first governor, a United States Senator, and Ambassador to the Court of St. James. When he returns to Shinbone after an absence of years he sees the progress and prosperity that resulted from his political work and is ready to move back.
            Most directors would have made the movie 15-20 minutes shorter by leaving out a lot of the government stuff, but Ford dwells on it lovingly and sentimentally. To these people trying to settle a hard, wild land, the federal government wasn’t the problem — it was the solution.
            Joining the United States meant having law and order; it meant getting a state legislature that would enable all the settlers to mediate and resolve their issues without resorting to hired guns; it meant dams and irrigation projects that would allow farmers and communities to flourish.
            Of course the movie was fiction and of course there was more to the story, but that theme rings true. At the time Ford made it, John F. Kennedy was president and most people still believed in the country and its government. Americans could see the difference that government had made in their lives. Factory workers enjoyed good wages and benefits thanks to unions, which grew in power after the New Deal’s Wagner Act. The GI Bill enabled many people to get an education and move into the professional classes. Social Security allowed old people to retire in a semblance of economic dignity.
            And it won’t do to say that government has become corrupt since then. It was always corrupt. At the time the film was set, in the late 1800s, large corporations controlled Congress even more than they do now. At the time the film was made, the Southern bloc of congressmen and senators was stifling civil rights while the just-departed president was warning of the growing power of a military-industrial complex.
            America worked through a lot of those problems because we believed in ourselves and believed in our ability to make the government work toward a more just society. If a movie celebrating that theme seems passé, it’s a sad commentary on our times.
           

Friday, June 17, 2011

Point, Click, Post — Oops!


            Imagine, for a moment, that 30 years ago a passionate young woman had seen a handsome congressman make a fiery speech on network TV and had sent him an exuberant and effusive letter of support. Imagine, further, that said congressman, after reading the letter, decided that the proper way for him to acknowledge her kind words would be to send a close-up photograph of his underwear.
            It would have been an undertaking. He would have had to find a camera, which people didn’t readily keep at hand in those days; probably get some film to put in it; shoot the entire roll of film (wouldn’t want to be wasteful), after carefully focusing and calculating the f-stop and shutter speed; rewind the film and remove it from the camera; take it to the drugstore; wait overnight for it to be developed; select one of the prints; dictate a cover letter to his secretary; put the picture in with the letter and mail it.
            And somewhere in this long and tedious process you would like to believe that a light bulb would come on inside the congressman’s head and he would say, “You know, maybe this isn’t such a good idea.”
            Perhaps not; politicians have been a historically randy lot. Our Founding Fathers were not, as a group, very good at restraining their sexual appetites. They weren’t Tweeters (though Ben Franklin would have been a great one), but plenty of their hand-written correspondence survives to this day and can be readily viewed by any scholar who needs confirmation on that point.
            The recent epidemic of digital dissemination of personal erotica is less about lust, as most past scandals have been, than it is about exhibitionism, a character defect greatly encouraged by the internet and by social media in particular. “Look at my underwear” is merely an aberrant (and probably more entertaining) variation of the “Look at my family” or “Look at my pet” photo that people daily inflict on their Facebook friends.
            We now have the tools to call attention to ourselves instantaneously, without stopping to think about whether we really should. And when just about every member of Congress carries an iPhone or BlackBerry, there’s no protective staff member standing in the way of the execution of his worst impulses. I say “his,” because, to this point, women in politics have been more circumspect.
            One of the wisest observations about this phenomenon was buried deep inside a recent New York Times story, and it bears quoting in full:
            “The digital revolution for a lot of people in politics is like a high school party where they experience alcohol for the first time,” said Mark McKinnon, former media adviser to George W. Bush and John McCain. “They get very excited, lose their inhibitions, say and do things they shouldn’t, and realize too late they’ve made complete idiots of themselves. And then can’t undo it.”
            He continued: “Digital politics invites and rewards quick triggers and does not reward thoughtful reflection or careful judgment. And so it is no surprise that we see so many politicians fail to clear their holsters before they drop the hammer.”
            Given that Twitter, Facebook, and the internet aren’t going anywhere, we should appreciate that the danger they pose is not so much making us immoral as making it too easy for us to be juvenile. The take-home message, and not just for politicians, is: Be careful. Be very careful. If you’re not, you could end up acting like a teenager in front of the whole world. Remember how bad that was the first time around, when only the people at your high school were watching?


Tuesday, June 14, 2011

The Good Doctor and the Good Mechanic


                  A few years after we bought Linda’s Volvo station wagon, the engine died noiselessly one day when she stopped at an intersection. It wouldn’t start again and had to be towed to the dealer. Where it promptly started up perfectly on the first try.
                  They put it in the service bay, hooked it up to the latest computer equipment and ran every diagnostic test they could think of. The verdict: The computer doesn’t show anything wrong, so there’s nothing to fix.
                  Several days later it died at an intersection again and had to be towed to the dealer again. Where it promptly started up perfectly on the first try.
                  They put it in the service bay, hooked it up to the latest computer equipment and ran every diagnostic test they could think of. The verdict: The computer doesn’t show anything wrong, so there’s nothing to fix.
                  By this time we were beyond frustration. Not only had we paid a few hundred dollars for two diagnoses that took us nowhere, but we were also getting snarky messages from AAA about how we should take better care of our cars so they don’t have to be towed all the time.
                  Then we took the Volvo in to our regular German auto mechanic for a routine oil change and mentioned the problem of the stalling and starting. He had no computerized equipment for diagnosing a Volvo, but he had something better — a feel for cars and a functioning brain.
                  “You know,” he said, “I had a Volvo station wagon in with a problem like that a couple of years ago, and it turned out to be a defective thingumy-jingumy.” (Sorry, but that’s as far as my specific recall of automotive detail goes.) “I can’t promise you that’s what it is, but if you want, we could try replacing it and see if that takes care of the problem.”
                  Based on years of good experience with him, we took his suggestion and replaced the thingumy-jingumy. The problem went away.
                  Which reminds me of the time, a few years ago, when I went in for a physical and one of the blood-test results came back slightly elevated. It was the last physical with a doctor I’d had for years, who was retiring. The old doc looked at the results and philosophically opined that given my medical history and general health level it was nothing to worry about yet and that his counsel would be to watch and wait.
                  A year later I went in for a physical with the new, and much younger, doctor assigned to me by the HMO. The same tests came back at the same, slightly elevated level. The young doctor took it very, very seriously and put me through more tests and an ultrasound to try to get to the bottom of it, but the tests didn’t provide any clues.
                  Then we switched health plans, and I had to get a new doctor. I wondered what this one would say about the elevated test levels, but I never found out. When the blood tests came back this time, the previously elevated levels had dropped to normal without any action on my part or that of the medical establishment. The old doctor was vindicated.
                  And the moral of the story is: Good diagnosis is often as much about experience, intuition and artistry as it is about science and technology. That’s true for physicians and auto mechanics alike. They’re both in the same business, really — trying to keep a complex and perishable piece of machinery running as well as possible for as long as possible.

Friday, June 10, 2011

Old-School Print Guy Reads E-Book


            I lost my e-book virginity this past weekend to a woman I met online. She’s been dead a while, too, if that makes the story any more marketable.
            Her name is Anne Austin, and in 1931 she published a mystery novel titled Murder at Bridge, about a woman shot to death in a Midwestern city while a number of her friends are over for a friendly round of cards. I wouldn’t recommend that you run out and buy this book, but I did because it was 99 cents at the Kindle store, and there was no sense in sinking too much money into an e-book library until I’d read an entire book that way.
            When the first e-readers came out a few years ago, Linda asked if I wanted one for Christmas, and my response was a less polite and more profane variation of “Over my dead body.” I am an old-school print guy — English major in college, newspaperman for two decades. I love the feel of a book in my hands and the look of it on the shelf. I love paper and ink. After years of trying, I still can’t get used to reading a newspaper online. It just doesn’t seem like the real thing, and I was sure an e-book would feel equally wrong.
            A few weeks ago I bought an iPad, primarily as a way of taking my business, or a significant piece of it anyway, easily with me when I leave the office. Since it can double as an e-reader, and since I had some money left on an Amazon gift card, I reluctantly decided to try an e-book just one time. As Voltaire observed in another context: Once, a philosopher; twice, a pervert.
            It took a while to get the iPad connected to Amazon and the gift card, but once that was done I searched the Kindle store for mysteries, and it kicked up a few from the classic era that cost next to nothing, probably because the copyrights had expired. I chose Ms. Austin’s tome and another by two Edwardian authors I’d never heard of.
            Saturday afternoon, iPad charged to 100 percent, I sat down to read the book and was pleasantly surprised. Not by the book — I deduced the killer less than halfway through — but by the experience of reading on the iPad.
            Several years ago, on one of the C-SPAN Sunday book shows, they were interviewing Tom Wolfe, who was railing about the experience of reading on computers. It took humanity centuries to progress from scrolls to the book, he said, and now computers have taken us backward to the scroll. The e-reader takes us forward to the book again, with distinct pages, easily turned with the tap of a finger, and much as I hate to say so, I liked it.
            The advantages: Bigger, easier-to-read type with a bright background; can hold with one hand or lean against a leg while holding a cup of tea in the other hand; nearly impossible to lose your place.
            Disadvantages: No page numbers; harder to flip back to an earlier page with a diagram; probably have to take a recharge break for the battery if reading a longer book (this mystery, which I would estimate at 225-250 standard pages, drained 65 percent of the iPad’s power by the time I was finished).
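If you’re curious how far a full charge might stretch, here’s the back-of-the-envelope version, sketched in a few lines of Python. The page count is only my rough estimate, so treat the result as a guess, not a spec sheet.

```python
# Rough numbers from my reading session: an estimated 225-250
# "standard pages" used about 65 percent of the battery.
pages_low, pages_high = 225, 250
battery_used = 0.65  # fraction of a full charge consumed

# At the same drain rate, how many pages a full charge might cover.
full_low = pages_low / battery_used
full_high = pages_high / battery_used
print(f"A full charge: roughly {full_low:.0f} to {full_high:.0f} pages")
# prints "A full charge: roughly 346 to 385 pages"
```

In other words, anything much longer than an ordinary novel would likely mean plugging in somewhere past the 350-page mark.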
            All in all, not bad. Three years ago I would never have believed I could say such a thing.  

Tuesday, June 7, 2011

The Republican Dr. Strangelove Syndrome


            When Dwight Eisenhower was elected president in 1952 — the first Republican in two decades to hold that position — he had some blunt advice for the hotheads in his party.
            Don’t even think about trying to abolish the New Deal, he told them. It’s popular and it’s here to stay. Our job is to manage it better. His party took the advice for a while, but like Dr. Strangelove, it keeps trying to hit that mutual destruction button, attacking popular programs and blowing itself up in the process.
            Social Security (a New Deal program) and Medicare (a Great Society program) have been around for decades, and they aren’t going anywhere for a simple reason: They work, and probably 95 percent of the public understands, at some level, that these programs will help them at some point. Forty years from now, President Obama’s health care plan will almost surely enjoy the same support.
            When Congressman Ryan of Wisconsin introduced his budget proposal, which called for turning Medicare into a voucher program, he was widely praised for being serious about its problems. That any responsible commentator would say such a thing is a sad commentary on the dysfunction of the punditry.
            Ryan’s plan wasn’t serious at all because any attempt to reform Medicare or Social Security is a political non-starter unless it accepts the basic premise of both programs and attempts reform within that premise.
            For Medicare that means that it will continue to be a universal, single-payer program that will provide basic care for people from the time they turn 65 until the time they die.
            For Social Security that means that it will continue to be a program that pays out a guaranteed defined benefit from a specified retirement age until death.
            Agree on that, and it becomes possible to undertake real reform with an eye toward fixing the two programs so they will remain in decent financial health and fulfill their mission. Medicare, for example, could be substantially reformed by looking at the ways other countries provide universal care and stealing their ideas. It would be far better to do that than to reinvent a system that already delivers services so well.
            It’s hard for Republicans to take this approach because, I am convinced, they fundamentally don’t like Social Security and Medicare. But the foundation of effective political action is accepting what you can’t change and working to change what you can. Persisting in an attempt to undo a popular and widely accepted program is a recipe for failure and an impediment to the reform that could and should be made.
            President Obama grasped this point perfectly in putting forward his health-care bill. A number of Democratic true-believers were incensed that he never proposed a universal, single-payer program. Like Congressman Ryan and his ideological colleagues, they failed to grasp that the perfect (from their viewpoint) was the enemy of the good.
            Obama understood that most middle-class voters already have a health plan, and that most are reasonably happy with it because they haven’t had a problem with it — yet. Anything that threatened to change people’s existing coverage would have been demagogued to a speedy death. In my view, what he came up with was not as good as a single-payer system would have been, but it got the country a near-universal health care plan, which is more than Truman, Kennedy, Johnson, Carter and Clinton were able to do. History will look kindly upon that.
           
           

Friday, June 3, 2011

Nothing You Can Put Your Finger On


            A couple of news items in recent weeks served as a reminder of how much the everyday world is changing.
            One was a New York Times story reporting that, for the first time, Amazon.com is selling more digital books than physical, dead-tree books. The article was quick to point out that Amazon accounts for but a small share of the market and that book sales are still overwhelmingly made up of actual books. Nonetheless, the arc is clear.
            It was also reported, in another story on another day, that Google is working feverishly on a system to put credit cards and ATM cards in smart phones so people can make everyday purchases. If that comes to pass (and who doubts that it will?), it could mean the end not only of credit cards, but of cash and wallets as well. After all, the only reason for a wallet would be to hold a driver’s license, and that will probably be replaced by something digital too.
            The movement away from the tangible to the digital has occurred with breathtaking speed. Only 20 years ago, people still wrote letters. Does anyone now? I wonder from time to time how historians of this era will cope, having to dig up the e-mails of noteworthy people, rather than having actual letters. How many of today’s e-mails will be retrievable by the computers (or whatever they evolve into) of 2111? And if they are, who will have the time to go through all the daily business dreck that makes up most people’s e-mail output?
            Even an old-school print guy like me has to some degree embraced the digital improvements. I probably send three or four print letters a year (it used to be nearly that many in a week), and I find myself getting irritated if someone ahead of me in the supermarket line is writing a check and slowing things down. But the really intriguing question is what the unexpected effects of technological change will be. There are always a few.
            When the newspaper I worked for converted from typewriters and pencils to computers in the early 1980s, they thought they’d save money by eliminating proofreaders, since reporters and editors would, in effect, be inputting the copy rather than having it reset by someone in the print shop. We quickly found that so many mistakes were getting through that we had to hire another editor (better paid than a proofreader) to give stories a second reading and fix the mistakes that were created while fixing other mistakes. We found that even the sharpest editors missed more when they were looking at a story on the computer, rather than marking up a typewritten sheet of paper with a pencil.
            The replacement of hard cash by bank deposits and the provision of easy credit by banks and card companies have certainly changed the way people look at money. As the recent housing and mortgage crisis amply demonstrated, both borrowers and lenders tend to start forgetting it’s real money and begin seeing it as numbers on a spreadsheet. With all the personnel turnover in the financial industry, the personal connection, as well as the physical connection, to money borrowed and loaned has all but evaporated.
            There are some good books out there about the housing and financial crash of 2008. I’m thinking of ordering the e-editions from Amazon, paying with a credit card, and reading them on the iPad I just bought.