Living in the Cloud

     There was a time when my phone rang almost constantly while I was at home, and my answering machine filled up with call-backs when I wasn’t — and I KNEW all these people and wanted to talk to most of them.  I liked talking to them. That was then. Now my phone hardly rings at all, and when it does, I generally don’t even answer it. So, what happened between then and now?

     Automated technology happened, which better enabled those annoying telemarketers to fill up your answering machine with messages during the day, and to interrupt your restful evenings with unsolicited pitches at home, especially during dinnertime. In order to facilitate compliance with the Telephone Consumer Protection Act of 1991, the Do-Not-Call Implementation Act of 2003 was signed into law by George W. Bush. The FTC’s Do-Not-Call registry opened that June. We all started signing up, but it wasn’t always easy to do. Success of the effort was sporadic at best, and temporary, since mechanisms for automated dialers and access to number directories expanded faster than the registry’s ability to keep up.

     So then what happened? Mobile devices happened. First there were pagers and beepers — important tools of communication not only for emergency responders and those in vital need-to-reach positions, but soon enough for the self-styled self-important everywhere, including drug dealers. High schools made “no pagers” part of student dress codes. (I know; I was teaching.) BlackBerry introduced its e-mail pager in 1999, then its phone in 2002, and CEOs everywhere rejoiced. President Obama loved his BlackBerry so much that, in spite of security concerns, he insisted before his inauguration that staff would have to “…pry it out of my hands.” (They didn’t, and he kept it until 2016.)

     Cloud computing happened next in 2006, the iPhone was introduced in 2007, and the rest, as they say, is history. Not surprisingly, Congress passed the Do-Not-Call Improvement Act of 2007, designed to extend the reach of registry regulations from landlines to fax machines to cell phones. It took effect in February 2008, but it was too little too late. By then we were all “living in the cloud.” Social media and digital communication have defied regulation and dissolved personal restraint ever since. We now find ourselves constantly assaulted by telemarketers, text messages, and Twitter tantrums on our cell phones, and incessant robocalls and messages on our landlines (if we still have them).

     And that’s what happened between then and now.

     Now don’t get me wrong: I love my iPhone, my iPad, my iMac, even my outdated iPod. These products have offered i-Me safety, convenience, accessibility, and happiness. For us baby boomers, the first generation to be generally proficient in technology and digital communications, electronic devices will prove to be even more of a godsend as we grow older. Loss of physical mobility will no longer mean a complete loss of independence and isolation in our own homes. We can (and do) stay connected to the world and with friends and family far away; we can (and do) manage routine chores such as shopping, banking and paying bills; we can (and do) entertain ourselves through movies, music, videos and books; and we can (and do) continue to learn and grow through online classes and tutorials. Why, we can even run a home-based business!

     All this electronic convenience comes at a cost, however, for young and old alike. Ironically, while our computers and smartphones were supposed to enhance communication, the loss of the human touch in the quest for efficiency seems to have eroded it. I watch young people, ostensibly out on a dinner date, sitting with their faces to their cell phones instead of to each other; I see people in church spending much of the service texting on their smartphones; I find that business agreements are routinely done through e-mail, which in turn serves as the contract itself, sans signature. No one actually talks on the phone anymore; they don’t even leave voice mail, preferring to text instead. We’ve all lost our voices, literally and figuratively, in the abrupt, time-sensitive demands of simply “messaging.”

     Psychologists and sociologists have long warned of the lack of development of mature language and interpersonal skills among those who confine themselves in a digital world, but that’s just the least of it. The explosion of social media has obliterated behavioral norms, invaded personal privacy, promulgated false information, encouraged bullies and voyeurs, and rendered the nuances of artful conversation and the civility of public discourse totally irrelevant. After all, it’s so much easier to be threatening, intrusive and obnoxious, even downright offensive, when you’re not face-to-face.

     These days, rather than enhancing communication, it seems that living in the cloud has brought us into the fog — the fog of war between multiple factions: classes and cultures, men and women, liberals and conservatives, religious and secular, and on and on and on. We’re not only not communicating, many of us are hardly speaking to members of our own family! 

     Now as we enter the ridiculously premature campaign for the 2020 elections, we will be back to where we began: back to the intrusiveness of still more robocalls and messages from political organizations promoting candidates, fund-raising, and polling, ALL of which are exempt from the Do-Not-Call rules.

     Oh well, the registry never really worked anyway…

Memorial Day

      Most Americans know that Memorial Day honors the men and women who died while serving in the U.S. military. What people might not realize, however, is that Memorial Day had its beginnings right after the Civil War. By 1866,  many towns and cities had already begun paying tribute to some of the 620,000 soldiers who had fallen (roughly two percent of the population at the time, and more lives lost than in any other war since) by decorating their graves with flowers in the springtime and remembering them with prayers and ceremonies. In 1868, General John Logan, the leader of an organization of Northern Civil War veterans, called for a national day of remembrance; it was named Decoration Day and set on the 30th of May. 

     Decoration Day continued to honor the Civil War dead into the 20th century, but as America became embroiled in more wars, WWI and WWII especially, the holiday evolved to commemorate all American military personnel who died in any war. In 1968, Congress passed the Uniform Monday Holiday Act in order to create a three-day weekend for federal employees. The last Monday in May was designated. That law went into effect in 1971, and Memorial Day officially became the federal holiday that it is today.

     UN-officially, Memorial Day is also considered the start of summer for millions of American school children and their families. In many states, schools are out by the end of May, the weather is warm enough to open swimming pools, and traffic begins to build on roads headed to the beach — even in New England, where it is still most likely too cold to swim. Monarch butterflies and migratory birds — songbirds and shorebirds, including Florida’s “snowbirds” — are all headed home. Patios and decks get cleaned off, gas grills get fired up, and automobiles get checked out and tuned up for those weekend trips.

     When I was a kid growing up in South Texas, Memorial Day ushered in a full three months of summer vacation, days to spend as I pleased. And I was pleased to spend many of those days in and around water, down in Port Aransas or Rockport, where my friends’ families had bay houses, or at swimming pools, where friends had club memberships or pools at home. I was hardly an Olympic swimmer, mind you, but I loved swimming and diving and playing water volleyball. Everybody did. Even when my Mother and I took little summer vacations, we spent them at a hotel with a pool, on the Gulf in Corpus or Galveston or down at South Padre. In an era before universal air conditioning, being in or near the water was the only sure way to combat the oppressive Texas heat.

     By the time I was in high school, I had learned to water ski and had even taken up surfboarding (if you can believe it) on those “pint-sized pipelines” out on Mustang Island. It was the 60’s, after all, and we did live right along a Coast, even if it wasn’t in California.  “Good Vibrations” was the soundtrack of our lives. We watched all those beach party films, saw ourselves as Annette Funicello or Frankie Avalon, and promptly divided everyone we knew into two camps: the bikers and the surfers. Any doubt about which group was the cooler? 

     Of course, as I got older and moved on to college, I still wore my hair long, still grooved to the music of the Beach Boys and The Mamas and the Papas, and still made sure that I had at least two swimsuit/cover-up sets at the ready for the summer, but I also began to realize that life was not always a beach and that lazy days lolling around the pool or “laying out” on the sand were harder to come by. I got summer jobs or took summer classes and, as the back-to-school days of August approached, I often lamented how few days that year I had actually spent in a pool or on a beach. Gradually, Memorial Day came to signify a gateway day to bigger events, to college graduations, June weddings, and emergence into adulthood in “the real world.”

     I can’t say that I didn’t appreciate the youthful summer fun I had while I was having it, but as with most things, I probably also took much for granted — certainly the physical prowess that enabled me to do back flips off the diving board or to slalom on one ski. But memories die hard. When we retired back to South Texas ten years ago, the one thing I was adamant about was that we would have our own swimming pool in the back yard. And so we do, and it is lovely. My husband swims almost every afternoon from early April to late October. 

     Don’t ask me how often I swim these days, but the sounds of the waterfall are peaceful and soothing as I sit on the patio and read.

Acknowledgements

Every book has an Acknowledgements page intended to make special mention of those who have helped the author complete the written project in some significant way. Acknowledgements may cite editorial or professional support, research or interview cooperation, informational or inspirational sources, or simply emotional and moral encouragement. I have had such pages in the front of my own books; the list of acknowledgements may be long or short, detailed or general, depending on the breadth of the work. A major book of non-fiction or an important academic treatise might have involved numerous interviews and multiple media sources, for example, while a slim volume of poetry might only need to identify those who have helped sustain the poet’s creative efforts.

     If you think of a life as a book being written while lived, then obviously the list of acknowledgements the “author” needs to make grows steadily over time.  Often those acknowledgements are made at pivotal points in the “author’s” life, on important occasions such as graduations, weddings, awards, or retirements — but only if the person is wise enough, and humble enough, to recognize that he/she couldn’t even survive all alone, much less achieve anything.  

     Sadly, sometimes acknowledgements are only made by others on behalf of the author in the form of eulogies or obituaries when the story is done. Frankly, that’s too little too late.  I have tried along the way to thank those people who have profoundly influenced my life and work, those mentors, colleagues, family and friends who have encouraged me and helped me to do my best and be my best self.  It isn’t always easy to see how significant the contributions of others are to your life at the time, but with age comes the clarity of hindsight and the opportunity to address any oversights. 

     Now stay with me here, because I’m about to make a segue that might not seem to make sense.  Since I’ve been in the throes of “spring cleaning” (see April 5 post), I’ve realized that  acknowledgement is a necessary part of clearing out and trying to simplify. Those things — most especially books, but also other objects— that have contributed to who I am, what I know, and what I am able to do also deserve some recognition, even as their continued presence on my library shelves must be reevaluated. As I sort through my 1,500+ volumes and prepare my donation for the local library, I have nothing but gratitude for what these very real authors and their very real books have taught me. Next to the people in my life, books have been, and are, my closest companions. 

     I have, for instance, built a collection of gorgeous books on floral arranging, home decorating, tabletop and entertaining, not to mention cooking, clothing, tailoring, quilting, art and design. From David Tutera to Bunny Williams to Martha Stewart to Jacques Pépin, I have them all and I have learned from them. But I have to admit that the need for many of them has passed; at this stage, I have either mastered the skills they contain or no longer need to.

     Inevitably, part of acknowledging these authors’ contributions to my life is to also acknowledge that certain periods in my life are over. I am no longer hosting elaborate dinner parties, no longer entering flower shows, no longer upholstering couches or making draperies. And so it is also with the corresponding accouterments of those times: the silver coffee service or the multiple sets of china and glassware; the numerous vases and containers for floral displays; the closet full of linens and serving dishes, the bins full of Waverly™ home fabrics. Of course, I appreciate the memories these things evoke, memories of decorating a brand new house or of fancy dinner parties with delightful friends, but these are not events that will likely be repeated. 

     Cleaning out a clothes closet brings about similar realizations. Besides the fact that some of my favorite garments no longer fit, I am now retired in South Texas, not working in the Northeast. I don’t need heavy sweaters or tailored suits, and I certainly don’t need formal attire! Except when I travel, my everyday wardrobe consists mostly of yoga pants and t-shirts, albeit in those New York shades of grey and black. I will forever credit the great 5th Avenue department stores, Lord & Taylor, B. Altman & Co., and Saks Fifth Avenue, for teaching me how to “invest” in good clothes and achieve a timeless style, and I still associate particular garments with certain events, much as one does when hearing an “oldie but goodie” song on the radio. But once again, the needs and occasions for which those clothes were acquired are no longer part of my life.

     I could continue through a whole inventory of housewares and keepsakes and tchotchkes that must be reevaluated in my current cleaning frenzy, but you see where I’m going by now. Aging is a fact of life, and aging well is about acknowledging the past with gratitude, accepting the future with hope, and adjusting present expectations accordingly, whether those expectations involve physical limitations, emotional baggage, or lifestyle choices. 

     Actress-turned-philosopher Brooke Shields has been quoted as saying, “I persevere, and not just blindly. I take the best, get rid of the rest, and move on …”  True enough, but the less encumbered you are with “stuff” of any kind, the easier it is to move on.     

Strolling Savannah

Said to be America’s first planned city, Savannah was initially laid out in a series of grids marked by wide streets and lush public squares, 24 of them to be exact, 22 of which still exist and still invite leisurely strollers and sitting people-watchers today.  General James Oglethorpe, who landed on a bluff along the Savannah River in 1733, founded Georgia, named it after King George II, and thus established the 13th and last of the original colonies in the New World. Savannah was Georgia’s first city and is still, arguably, its most beloved. 

     Savannah was thrust, rather abruptly, into the public spotlight in 1994 with the publication of John Berendt’s best-selling non-fiction narrative, Midnight in the Garden of Good and Evil, later made into a movie starring Kevin Spacey and Jude Law. The murder of young ne’er-do-well Danny Hansford by wealthy art/antiques dealer Jim Williams in 1981 is the central event of the story, but the scandalous intrigues, eccentric characters, and romantic settings tell a parallel, equally compelling story of Savannah herself.

     Berendt, a columnist for Esquire, spent eight years living between New York and Savannah, so he knew the city about as well as any outsider could. Ultimately, Midnight is a commentary on class and wealth, history and tradition, gentility and decay in a proudly Southern city. It is recounted with a talented writer’s realistic eye and a part-time resident’s understanding heart.  As a non-fiction novel, Midnight would do Truman Capote proud; it is a book I greatly admire and one I wish I had written. 

     But, I didn’t, so I did the next best thing: recently, I finally visited Savannah for the first time to inspect the exquisitely restored 18th and 19th century homes in the historic district (a restoration project in which Jim Williams was a leading force), to see the Mercer House on Monterey Square where Williams lived and the murder took place, to gaze upon the serene statue of the “Bird Girl” depicted on the cover of the book, which once stood in Bonaventure Cemetery (she is now housed in the Telfair Museum because hordes of tourists were trampling the cemetery), and to sense the “ghostly presences” darting among moss-hung oaks in the morning mist. It’s all there just as described by Berendt and still attracting throngs of visitors every year, along with carriage tours and riverboat rides, art museums and historic landmarks, shrimp and grits and fried green tomatoes — and yes, Paula Deen’s The Lady and Sons Restaurant.

     Rosemary Daniell, a Savannah writer, has pointed out that despite the city’s beauty, it has a dark side and that “…the present runs concurrently with the past” (quoted in After Midnight in the Garden of Good and Evil by Marilyn J. Bardsley). Having lived in Memphis myself for five years, I’d say that description easily applies to other cities in the deep South, cities such as New Orleans or Charleston, Richmond or Montgomery.  They display Southern charm and warm hospitality on the surface, even as their societies remain insular and immutable. Attitudes and prejudices have long, deep roots in old family trees and create a general distrust of outsiders that impedes fundamental social change even today. 

     Some variation of the closed community exists almost everywhere, of course, whether it’s in clubs and organizations, or schools and neighborhoods, but living in Memphis made me realize that Texans are not really Southerners in the generally understood way. First of all, Texas was late, last, and reluctant to join the Southern cause in 1861; its pro-Union Governor Sam Houston refused to pledge allegiance to the Confederacy. Much of the state was still a frontier during the Civil War years, and so only East Texas, where cotton and sugar cane were cultivated, was commonly considered part of the deep South, and the far westernmost boundary of it at that. Secondly, people in the rest of the state, including my own German immigrant ancestors, were busy establishing local law, developing land and cattle, and trying to acquire the renowned ranching skills of Mexican and Spanish caballeros. Thus, those Texans, along with the Native Americans who still inhabited the area, identified more with the culture and traditions of the West/Southwest than with those of the deep South. Most Texans still do.

     And then, finally, came the discovery of oil in 1901 — and with it came all the roustabouts and wildcatters and fortune hunters that forever dispelled any pretensions of Southern charm and gentility. Ironically, Spindletop, that first gusher, was in East Texas near Beaumont. For a vivid picture of how oil and cattle came to dominate the Texas culture, read Edna Ferber’s Giant (or see the 1956 movie starring Elizabeth Taylor and Rock Hudson, filmed near Marfa in West Texas).

     Lest all this history and literature begin to bore, let me explain that for me part of the pleasure of travel is the anticipation of the trip, including the reading and “studying up” in preparation. Granted, books and films and travelogues can easily rely on clichés and promote stereotypes, and someone else’s view/review of a place can never serve as a de facto endorsement or dismissal, any more than a movie review or a restaurant critique can guarantee the appropriateness of that choice for a particular patron. Nevertheless, I try to learn as much as I can about a place in advance because, in spite of the delights of serendipity, I simply cannot imagine arriving cold, without any historical or cultural context to enhance my understanding of where I am and what it means. I guess that’s the teacher in me, or the writer, or the skeptic…

     So, did I enjoy the trip to Savannah? Did I have memorable meals, get inspired by great art and architecture, and feel calmed by the gracious romance of those shady squares? Did the sight of riverboats down at the wharf make me smile, and did I even experience a bit of “serendipity” on an unexpected visit to Tybee Island?  Yes, emphatically, I did, and I enjoyed it all for several days while staying at the Planters Inn (where John Wesley’s parsonage stood in 1736). My review: Savannah is a great place to visit.

     But I wouldn’t want to live there.

Spring Cleaning

     Spring has come in fits n’ starts this year, as have I. Yes, the bluebonnets are up and I have bouquets of them on my table, though they are already fading here in South Texas and have yet to fully burst forth in the Hill Country. Temperatures all last week were in the 40s–50s at night, and maybe 70 during the day, but yesterday we hit 90. Such are the vagaries of Texas weather. Pollen is the one sure thing that is blossoming big time, however, and I have the sneezes and watery eyes to prove it.

     While allergies may make some people miserable, most folks still eagerly await the warmth and sunshine of spring and welcome it as a time to refresh and renew, to clean up, clear out and begin again. This is especially true in areas where long winters of six months or more breed a serious case of cabin fever. As an astronomical season, spring is based on the earth’s orbit around the sun, marked by two solstices and two equinoxes. Yet, while people have used natural phenomena as markers of time for thousands of years, the vernal equinox associated with the beginning of spring in the Northern hemisphere (around March 21) doesn’t necessarily coincide with spring-like weather.

     The origin of “spring” as a meteorological, and therefore metaphoric, season is easier to understand but harder to pinpoint, based as it is on climate and temperature cycles. In Western Europe where the Catholic Church held sway for centuries, spring was originally called Lent to indicate the days leading up to Easter Sunday. Sometime in the Middle Ages, the period began to be called “springing time,” because of all the plants and flowers “springing up” from the ground. A while later, the season became known as “spring time,” and then, finally, just “spring.”

     It’s interesting to note now how many cultures in the world, regardless of their religious beliefs or geographic locations, celebrate a “spring-like” season of rebirth with cleansing rituals and traditions. One of them, the custom of “spring cleaning,” can be traced back to ancient Jewish practices of thoroughly cleaning the house in anticipation of Passover (itself a forerunner of Lent). Similar traditions are found in the Iranian Nowruz, the Persian new year, which falls on the first day of spring. The practice of khooneh tekouni, which literally means “shaking the house,” is still honored today. In some cultures, “spring” cleaning and the urge to refresh and renew comes at the end of the year, which could actually be in winter or summer, depending on their calendar. 

     I have been particularly ruthless about clearing out and cleaning up this year. Part of that impulse, of course, has been occasioned by the cartons of papers and documents, family photos and records, and personal possessions and keepsakes of my Mother’s that now are deposited in a guest room, in a closet and in the garage. Not only am I still trying to wrap up her affairs, I am now trying to distribute, dispense, and dispose of the leftovers of her life, much of which I personally value, but simply don’t know how to accommodate. 

     Integration of belongings from one generation to the next is a tedious process, and a sad one.  As I go along trying to marry some of her things (photos, knick-knacks, special treasures, pieces of furniture) with my own, I inevitably find myself overwhelmed and irritated with the clutter I myself have accumulated. I open drawers and things fall out; I go to shelves and have no room left; I look in closets and am met with chaos. Thus, a cleaning frenzy ensues — one not entirely due to the season, nor one entirely based on hope for the future. Not only am I feeling  encumbered by all my own “stuff” right now, but I am also facing the future reality that our one bachelor son doesn’t want to be encumbered by it either. 

     In 1986, comedian and social critic George Carlin did a routine about “stuff” that still resonates with me. It was masterful, as so much of his very original work was, because it satirized Americans’ lust for big money and high style, which so dominated the 1980s, through the simple, immediately familiar descriptions of our own everyday relationships with our own everyday “stuff.” As Carlin made plain, you didn’t have to be one of the Carringtons on Dynasty to be guilty of conspicuous consumption: You buy a house to “have a place for your stuff.” Pretty soon you need another, bigger place to “keep your stuff while you go out to get more stuff.” Then a whole storage and security industry develops based on a need for “keeping an eye on your stuff.” Ironically, Carlin performed this routine for Comic Relief at a charity event to combat poverty. (Find it on YouTube.)

     Tomorrow my community is having a giant disposal day, with shredders for papers and documents, bins for outdated prescriptions and medications, and dumpsters for unusable appliances, furniture and other household items that regular garbage collection won’t take. That bachelor son of ours is coming over with his big pickup truck to help us haul our “stuff” over to the drop-off site. This isn’t everything that I need to get rid of, but it’s a start.

     Fits n’ starts — that’s what this year’s spring, and spring cleaning, is all about.

Ash Wednesday

      The season of Lent is upon us and with it the rituals of repentance: fasting, abstinence, and acts of self-denial; ashes, palms, and Stations of the Cross; mournful music, daily devotionals, and confession of sins. The First Council of Nicaea (325 A.D.) established the 40 days of Lent beginning on Ash Wednesday as a period of fasting and sacrifice in preparation for Easter. While specific rites and rituals may differ among Christian denominations, some form of Lenten practice is traditionally observed not only by the Roman Catholic Church, but also by Eastern Orthodox, Anglican, Episcopalian, Lutheran, Methodist, Presbyterian and some Baptist churches. Any particular “sackcloth and ashes” practices of Lent have been created by churchmen and have no specific foundations in Scripture, though a biblical basis for fasting and the denial of worldly temptations is usually attributed to the gospel accounts of the 40 days and 40 nights Jesus spent in the desert in spiritual preparation for his public life and ultimate sacrifice.

     The desert as a landscape of a parched spirit is a recurring metaphor in both the Old and the New Testaments. Paradoxically, it becomes both a place of threatening isolation and death, and a place of spiritual encounter and renewal: Abraham casts Hagar and her son, Ishmael, out into the desert; Moses spends 40 years in the desert before returning to Egypt to free his people, and then spends another 40 wandering there with the Israelites; the people of Judah are exiled to Babylon, only to be called back by God and ordered to return through the wasteland from which they’d come; Mary and Joseph take the baby Jesus and flee into the desert to avoid Herod’s wrath; Jesus grows up in the small town of Nazareth, literally in the middle of the desert; John the Baptist preaches and converts in the desert, and baptizes Jesus there. The references and subtleties that underlie all these narratives are made stronger and more meaningful by virtue of their settings; the stories and images taken together are perfectly suited to the mood and the meaning of Lent.

     The writer in me wants to believe that this well-developed biblical metaphor was no accident for the early authors of Scripture, even though they wrote at different times over many different years. Yes, it’s true that almost all of the Holy Land where these stories take place is, in fact, a desert wilderness.  If you’ve been to the Sahara, the Sinai, the  Wadi Rum or the Arabian Desert and seen the vast expanses of nothingness there today, then you have not only seen, but no doubt experienced for yourself the sudden, very real loneliness and desolation such a landscape invites. Even a “garden” in a populated city, such as the Garden of Gethsemane in Jerusalem (photo above), is a dry and arid place, a perfect setting for Christ’s final agony of spirit. 

     Not surprisingly, as both a traditional Catholic and a writer, I don’t believe in the “literal translation” of the Bible, but I do believe in the enduring truth of its overall message: “A voice cries out in the desert, prepare the way of the Lord!” (Isaiah 40:1-3)  Life is a journey and we are all wandering, sometimes aimlessly, toward something, somewhere, someone. Lent is a time of transition, a time to find the way forward out of whatever spiritual desert we may have wandered into toward a renewed promise of hope. We want to believe that our eternal spring is coming, and we want to be ready when it does, but faith falters and sometimes we have only the habits and rituals of that faith to keep us tethered. We are, after all, only human:

                   Although I do not hope to turn again

                   Although I do not hope

                   Although I do not hope to turn

                   Wavering between the profit and the loss

                   In this brief transit where the dreams cross

                   The dream-crossed twilight between birth and dying

                                                                                        Ash-Wednesday, T.S. Eliot

         I was introduced to the work of T.S. Eliot in 10th grade in Catholic school by a gifted, demanding English teacher named Sr. Gabriel. To be sure, it was quite an undertaking with a bunch of 14-year-olds, but as I later discovered myself while teaching high school, precocious teenagers who think they know everything love nothing better than the chance to prove it by ferreting out obscure references and abstract symbols. At any rate, I have been reading Eliot’s work, studying it and teaching it, and loving it, ever since. Many of his best-known poems, including The Waste Land and The Hollow Men, have to do with the emptiness of existence without hope, without faith. For me, they make perfect Lenten “literary” meditations.

     Ash Wednesday, often called his “conversion poem” because it was written after Eliot’s conversion to Anglicanism in 1927, is about the difficulty of religious belief in an age of uncertainty — uniquely relevant today.  But, is the fervent hope for salvation and life everlasting the same as a fervent belief in it? I don’t think so. The poetic persona through the five movements of the poem moves from utter despair through confusion and exhaustion to some final resolution of acceptance. Whether or not that acceptance is a true “conversion” or simply a form of surrender seems debatable. The opening lines of the last stanza of the final movement, to my mind some of the most beautiful and most sensible words to live by, may hint at a new beginning, but not necessarily one grounded in the fullness of faith:

                   Suffer us not to mock ourselves with falsehood

                   Teach us to care and not to care

                   Teach us to sit still

    The personal struggle for spiritual wholeness is not easy to describe or comprehend. Nor is the work of T.S. Eliot. One of his often-quoted maxims might appropriately be applied to both: “…genuine poetry can communicate before it is understood.”  Amen.


At Downton Abbey

      I just returned from seeing my old friends, Lord and Lady Grantham. They, along with members of the Crawley family and most of their household staff, are “touring” in America. Of course, they brought the accouterments of their lovely country home in England with them. Quite a transatlantic undertaking, to be sure. I spent six seasons with them, beginning in 2011, and have terribly missed seeing them these last couple of years, so I was delighted to be able to enjoy their company and their way of life once again.

     I am, of course, talking about Downton Abbey: The Exhibition, which is now touring the United States after opening in New York City to rave reviews last year. Currently, The Exhibition is on display in West Palm Beach, Florida, in a huge space (once a Macy’s department store) downtown at CityPlace. The Downton Abbey series, created and written by Julian Fellowes, ran on PBS for six years, garnered 15 Emmys, and by any measure became the most successful British television costume drama since the 1981 series Brideshead Revisited (based on Evelyn Waugh’s 1945 novel). Downton’s international appeal fostered a fiercely-devoted, worldwide fan base that has now spawned a whole Downton franchise of products and publications; the Exhibition serves to keep that fan base and franchise alive while providing exciting pre-publicity for the full-length Downton Abbey feature film set to open in September.

     Hey, it all works for me! In case you missed it — if unfortunately you fell into a coma or lived off the grid somewhere — Downton Abbey was an original six-season series that followed the lives of the Crawley family, headed by Robert, the 7th Earl of Grantham, and his American-born heiress wife, Cora, the Countess of Grantham. Through roughly 12 years of historic change and social upheaval in post-Edwardian England, these aristocrats struggle to maintain not only the Downton Abbey estate and the way of life within, but also to supply employment and meet their obligations to the villagers who live on that estate. For the most part, they have managed to do that through the considerable dowry Cora brought to her marriage, but the rise of socialism, the increase in taxation, and the changing technology of farming and trade have brought sudden and significant challenges to the influence and power, and to the finances, of the ruling class (which is why so many of the well-born were willing to exchange their titles for money in marriage to the daughters of self-made American millionaires). 

     To further complicate the situation at Downton, Lord Grantham has three daughters, but no sons. Only a son had the right of inheritance to an estate and a title.  For the 7th Earl of Grantham, whose property had been in his family for 500 years, the single most important goal in his life was to preserve his legacy and pass it on to future generations. That goal was shared by ALL the families at Downton, by the way, both the blood-related Crawleys upstairs and the work-related service staff downstairs.  In point of fact, between the two World Wars, aristocratic land owners became increasingly unable to achieve that goal, so their estates were broken up and their assets sold off for the first time in centuries. That spelled disaster and displacement for everyone concerned. According to The Exhibition catalogue, “Between 1913 and 1939 more of England changed hands than at any time since the Reformation in the 16th century.” (p.33)

     Evidence of such reduced circumstances endures today, even in Highclere Castle where Downton Abbey was filmed. The 5,000-acre estate has been home to the Earls of Carnarvon since 1679, including the famous 5th Earl of Carnarvon who accompanied archeologist Howard Carter in the discovery of the tomb of Tutankhamun in 1922. But, by the 21st century, Highclere had become largely uninhabitable, forcing the current 8th Earl and his family to live in a modest cottage on the grounds. Ceilings had collapsed, stonework had crumbled, and repair estimates soared to £12 million — a state of affairs that the 8th Earl himself attributed to the careless mismanagement of his ancestors.

     But then Highclere was discovered! Along came film crews, along came paying visitors, and along came a change in fortune that the current Countess of Carnarvon gratefully attributes to the on-site filming of Downton Abbey. Today, much of Highclere has been renovated and repaired, the family once again lives downstairs in the castle, and the estate is open to the public for visits and special events during the summer months. 

     And it was this state of affairs that actually made the fabulous Downton Exhibition possible. While all the outdoor scenes were shot on location at Highclere, the everyday scenes of daily drama were difficult to film inside due to tight spaces, poorly maintained interiors, and the priceless art and antiques that could be damaged. So, room sets, precise and authentic down to every detail, were recreated at Ealing Studios in London. It is these sets that have travelled and been reassembled for The Exhibition, and they are incredible: the servants’ quarters, Mrs. Patmore’s kitchen, Carson’s pantry, Mrs. Hughes’ sitting room, Lady Mary’s bedroom, and the truly magnificent formal dining room, complete with table setting and dried floral arrangements (photo above).

     And then there are the costumes, exquisite in detail and workmanship, authentic in fabrication and embellishment, and made specifically for the actors who would wear them. Being a seamstress, I was as mesmerized by the construction of the attire of the butler, footmen and maids as I was by the wedding gowns, day dresses and “hunting pinks” of the aristocracy; being a student of history, I was enthralled by the clear evolution of style that indicated changing times. 

     Most of all, though, Downton Abbey: The Exhibition brought me back to those Sunday nights of romance and relief when I could escape into a bygone world of elegance and etiquette and forget, at least for an hour or so, my own worries and the tensions and divisions erupting in the world around me. And, as I think about it, we here in the early part of the 21st century are very much in the same state of upheaval and transition as the Crawleys were in the early part of the 20th century. The only difference is that we can no longer rely on common rules of manners and civility to keep our behaviors in check. 

     “Why do the rituals, the clothes, and the customs matter so much?” Tom Branson, the Irish chauffeur who ultimately marries the Earl’s daughter, Sybil, asks Violet, the Dowager Countess of Grantham (Maggie Smith), at one point.

     “Because without them we would be like the Wild Men of Borneo,” she replies.

     And so we are.

     Note: Reference P.T. Barnum for the “Wild Men of Borneo.”