Jane Austen and Pink Body Suits

I didn’t discover the genius of Jane Austen until my mid-30s. Blame my preference for historical non-fiction, and the fact that my high school and college literature instructors—who knows why—preferred the Brontës and George Eliot. I penned numerous papers on Jane Eyre and Silas Marner—all extremely uplifting and elucidating, I’m sure—but not a single essay on Austen’s books. I’m embarrassed to say my affection for Austen did not begin until I watched the BBC production of Pride and Prejudice—long after its first release. Four years later, I was—unwittingly—a bona fide fan veering dangerously close to devotee. When another big-screen version of Pride and Prejudice was released, I tracked the dialogue and shook my head when the screenwriter chose her own poor substitutes over Austen’s tidy, biting repartee. Austen, it seems, often inspires fanatical devotion.

Austen’s comic style developed early—by age 12—and her precocious efforts were encouraged by her outwardly conventional family, though she published her first book anonymously. Fellow authors such as Sir Walter Scott praised her work, but she also had her critics, Charlotte Brontë among them. Austen did not gain widespread appeal until the late nineteenth century; that interest has since evolved into unadulterated adoration. Academic heavyweights like Cornel West now discuss Austen’s work, and graduate students dissect her novels from every conceivable angle. But as novelist Anna Quindlen writes, “Serious literary discussions of [Austen’s work] threaten to obscure the most important thing about it: It is a pure joy to read.” Similarly, our own part-time Brattleboro resident Rudyard Kipling delighted in reading Austen’s work with his family. He greatly admired her gift for writing about human follies with a “delicate hand and a keener scalpel.” Quite simply, she’s a riot.

For those for whom reading Austen is not enough, there’s JASNA: the Jane Austen Society of North America. Founded in 1979 by three “Janeites,” JASNA held its inaugural meeting at the Gramercy Park Hotel in Manhattan, where about 100 attendees gathered to celebrate Austen’s work. As the New Yorker reported at the time, they wore dressy but comfortable clothing. By contrast, JASNA’s 33rd convention, which took place over three days this month, lured more than 700 participants—many of them dressed in Regency gowns and period-appropriate waistcoats—to worship at the Austen altar. They represent the 4,000-plus members of JASNA scattered across the North American continent. If you have a moment, go online and view the October 9th slideshow of the gathering on the NY Times website; these folks take adoration to a new level. Said one attendee, “This is a place where people can let their Jane Austen freak flag fly.”

JASNA can be a little intimidating and—dare I say—freaky. But I felt a kinship with the “Janeites” this month. Here’s what happened. The annual JASNA convention coincided with the sewing of my own “freak flag”; I read the NY Times coverage during breaks from making a costume for a very different gathering. My spouse and I decided to run the CHaD Hero half marathon—a benefit for Children’s Hospital at Dartmouth—fully clad in the garb of our favorite 1970s superheroes, Electra Woman and Dyna Girl. The theme of the race was “Be a hero for kids,” but honestly we’ve always wanted to run a race in fabulous costumes. It was the perfect vehicle; we knew we wouldn’t be the only superheroes. We were, however, the only ones in hot pink and neon orange body suits—complete with glittering gloves, sparkling spats and fluttering capes. We’d both spent years on stage in theatrical productions, but—as you can imagine—there’s nothing quite like running 13.1 miles through the streets of Norwich, VT and Hanover, NH in “electrifying” apparel.

It would be dishonest to say that I didn’t feel self-conscious at first. Skin-tight body suits are not my “go-to” gear, and my retrofitted elbow-length gloves kept getting caught on the Velcro of my sports watch. We also suffered unforeseen technical issues with our capes and didn’t fully anticipate the complexity of pre-race pit stops while navigating the body suits. But once we stepped across the starting line and strode out with shoulders back and heads held high, we had a ball. And here’s the best part: We only needed to run by people in order to pass on the joy. Spectators, race organizers, traffic directors—as well as those stuck in traffic—couldn’t help but laugh and smile as we flashed by. The lead singer of one of the many bands performing along the route burst out laughing when my shimmering cerise suit sailed past his stage. Although we finished at the front of the middle of the back of the pack, we felt exuberant in our super-ness.

Jane Austen might not have approved of the spectacle we made of ourselves, but she certainly would have seen the humor in it all. She—who fought against societal disapproval of women writers—would also appreciate the desire to break from convention. When actress and writer Emma Thompson won the Golden Globe for her screenplay adaptation of Austen’s Sense and Sensibility, she delivered her acceptance speech in the character of Austen, which allowed her to make suitably piercing comments about the entire affair. The speech, like Austen’s own writing, is daring, hilarious and perfectly pitched to the occasion. Thompson—and Austen before her—understood that we are sometimes most truthful and true to ourselves when we adopt another persona.

Like the JASNA participants, flout convention and let your freak flag fly.

Preschool Teachers and Pint-Sized Scientists

Last month a preschool teacher I know asked a townsperson to visit her classroom to discuss an important issue with her students. His response? “Sorry, I have a real job.” She felt ambushed by his reply and could not manage a clever retort. Instead she was left feeling—once again—unappreciated and misunderstood. Want to experience “real” work? Visit—or volunteer in—a preschool classroom. Try corralling and educating the pure rocket fuel that is barely housed in the body of a four-year-old; then you might understand a little of what these educators actually do.

In the educational pecking order, early childhood educators ought to be treated like the preening, cocksure rooster—valued, respected, and revered. Instead, they seem to rank somewhere just above stewing chicken. There are many reasons for this. Early childhood educators do traditionally “women’s work” and are paid less than other teachers, and our society often equates pay with status. Childcare is distressingly undervalued, and any classroom that—to the uninformed—looks like childcare is similarly unappreciated. Certainly it also has something to do with the fact that everyone seems to think they can do the job. How hard can it be? Hang out and play with kids?

But play is serious business. Dr. Alison Gopnik—psychology professor at UC Berkeley and a leading cognitive development researcher—asserts that “children at play are like pint-sized scientists testing theories.” Gopnik focuses on why young children can learn so much so quickly, and she has homed in on a kind of thinking children do while pretending: counterfactual thinking. Counterfactual thinking is the process by which we express what did not happen but could have happened if conditions had been different. (If she had hurried, she would have caught the bus. Or: If kangaroos had no tails, they would fall over.) While studying 3- and 4-year-old preschool students, Gopnik and her team found that children who were better at pretending were also better counterfactual thinkers. Children at play, much like scientists in the lab, “imagine ways the world could work and predict the pattern of data that would follow if their theories were true, and then compare that pattern with the pattern they actually see.” Gopnik argues that preschool play—and the counterfactual thinking embedded in it—is vitally important not only for children but also for the larger society. As she stresses, “it is a crucial part of what makes all humans so smart.”

Preschool teachers not only help to shape the minds of future scientists, they also train future workers more effectively than many job-training programs do. NPR’s Alex Blumberg reports that University of Chicago professor—and Nobel Prize-winning economist—James Heckman set out to understand why a job-training program did not help a group of workers get better jobs. He realized that workers enrolled in the program still lacked critical skills needed to advance in the work world. He identified these as “soft skills” like paying attention, focusing on an activity, being curious, resolving differences, and controlling one’s temper. Heckman argues these skills are learned primarily in preschool, and this makes preschool an excellent job-training program. He cites the Perry Preschool Project—a 1960s experiment in Michigan—in which 3- and 4-year-olds from poor families were divided into two groups; one received preschool education and the other did not. The students then attended the local public schools and continued to live in the same community. Twenty years later, the men and women who had attended preschool were much less likely to be incarcerated, earned much more money, were far more likely to have savings accounts, and were more likely to be employed. Other studies have shown similar results.

Early childhood educators are also on the front lines in the battle to protect imaginative play. We can poke fun all we want at “choice time” and how it doesn’t stack up against hard academics, but the research—a lot of it—says otherwise. Imaginative play develops a vital cognitive skill called executive function, which has many elements, the chief one being self-regulation. Underdeveloped executive function has been linked to high dropout rates, drug use and crime. Conversely, healthy executive function is a good predictor of academic success in school. Laura Berk—a longtime executive function researcher—explains that during make-believe play, children engage in something called private speech: a child will discuss with herself what and how she’s going to play. Berk says, “[T]his type of self-regulating language is highest during make-believe play…and has been shown in many studies to be predictive of executive functions.” Private speech declines precipitously when children’s play is more structured and prescribed.

Apart from shaping minds and developing future scientists and workers, preschool teachers are highly skilled translators. Despite having two small children of my own, I still feel like a traveler in the Casbah when visiting my son’s preschool classroom. Recently, on two occasions, I absolutely could not understand what students said to me. Each time I turned to one of the teachers for help. They focused entirely on the child and accurately translated what sounded like utter gobbledygook to me. I don’t have the gift, and I am grateful they do. They guide our kids on the path from clever cavemen to fully functioning Homo sapiens—no small endeavor.

I suspect that our condescension toward early childhood educators reveals something deeper: a dismissive attitude about young children and what they’re capable of achieving. In this, we show our own Neanderthal tendencies.

In Defense of the Bard

Have you heard the one about how American astronauts never really landed on the moon? It was an elaborate ruse—created on a Hollywood soundstage—and perpetrated on the entire world. NASA has been able to keep this colossal secret for half a century. But a decades-old secret is nothing compared to one that stretches back to the 1580s. Like our poor American astronauts, William Shakespeare has become a favorite target of those who desperately want to be in on a fantastical secret that the rest of us are too stupid or naïve to figure out. If you’ve not heard, Shakespeare is not really Shakespeare.

I’m re-reading a slim but highly entertaining biography of William Shakespeare by award-winning writer Bill Bryson, Shakespeare: The World as Stage. Bryson keeps this volume to just under 200 pages by focusing on what we actually know about Shakespeare’s life from the historical record. Shakespeare After All—a surprisingly readable 1,000-plus-page tome by Harvard University’s Marjorie Garber—is another option for those who have either retired or sworn off email, but the rest of us must be grateful for Bryson’s “bite-size” volume. I’d happily flown through Bryson’s lively biography once before, but did not stop long to ponder the ridiculous theories about Shakespeare’s “true” identity.

William Shakespeare’s contemporaries did not question the authorship of his plays. The Master of the Revels’ accounts from 1604–1605—the official record of plays performed before the king—list Shakespeare as the author of seven plays performed for King James I. As Bryson points out, you can’t get much more official than that. Shakespeare is also recorded in primary source documents as the author of the sonnets and the poems The Rape of Lucrece and Venus and Adonis. And a thinly veiled, clearly envious contemporary account published by pamphleteer Robert Greene—Greene’s Groat’s-Worth of Wit—charges that Shakespeare is an upstart theatrical player who has reached above his station by taking up writing. Clearly, the Elizabethans accepted his authorship.

The doubts started much later, and they did not have auspicious beginnings. Delia Bacon, born in the frontier country of Ohio in 1811, gradually became convinced—no one knows why—that Francis Bacon (no relation) was actually the author of Shakespeare’s canon. Her research methods were dubious—non-existent, really—and her writing unintelligible. Although Nathaniel Hawthorne wrote a preface for her book, The Philosophy of the Plays of Shakespere [sic] Unfolded, he hadn’t actually read it; almost immediately, he wished he had. He later said, “This shall be the last of my benevolent follies, and I will never be kind to anybody again as long as [I] live.” Despite being a critical bomb, her harebrained book became surprisingly popular among such luminaries as Mark Twain and Henry James. It didn’t occur to them that Francis Bacon—philosopher, statesman, jurist, scientist and author—was probably a bit too overextended to have also penned Shakespeare’s plays, as well as the works of Christopher Marlowe, Edmund Spenser and others, as some Baconians assert. Time constraints aside, Bryson notes, Francis Bacon openly disparaged the theater.

Anti-Stratfordians needed a new candidate. Enter J. Thomas Looney (seriously), a British schoolmaster, who in 1920 published Shakespeare Identified, in which he argues that Shakespeare’s work was actually written by the seventeenth Earl of Oxford, Edward de Vere. Looney asserts, as do all subsequent Oxfordians, that William Shakespeare lacked the erudition, sophistication and refinement to have produced such incomparable literature. The Earl certainly had all these qualities, but there are numerous problems with this theory. As Bryson explains, the Earl’s personality was patently abhorrent in every possible way. It is difficult to imagine him possessing “the compassionate, steady, calm, wise voice that speaks so reliably and seductively from Shakespeare’s plays.” He was also terribly vain, and it seems highly improbable that he would publish his most brilliant work under a pseudonym. But there’s another compelling reason the Oxfordians are dead wrong: Edward de Vere died in 1604—before many of Shakespeare’s plays appeared. Did he really leave a stack of manuscripts to a confidant to be released at regular intervals after his death?

Why would people favor a dead guy over William Shakespeare? Conspiracy theories proliferate when primary documents are scant, and there’s much we don’t know about Shakespeare. But that’s not unusual for an average citizen of Elizabethan England. The dearth of primary sources aside, the meat of the Oxfordian argument is that a person from Shakespeare’s humble roots could not demonstrate such breadth and depth regarding the human condition. They question how a tradesman’s son who never went to university could possibly write about Italy, Denmark and Scotland. Yet they don’t question the legitimacy of his contemporary Ben Jonson, who also never went to university. As Shakespeare researchers have shown, William Shakespeare’s country upbringing permeates his work—from references to dandelions as “golden lads” and allusions to aldermen’s thumb rings, to his frequent mention of the minutiae of the tanning trade.

The classism of the Oxfordians is unmistakable, which is peculiar given that many of them accuse Shakespeare scholars of being elitist for not taking their theories seriously. Could the son of a craftsman have created some of the finest literature in the Western world? Certainly. Although it makes for fun cocktail banter and may suggest a quest for truth, the anti-Shakespeare hoax reveals our own lack of imagination and our denial of man’s inexorable creativity. Whether it’s sending man to the moon or writing heavenly prose, we can be an exceedingly clever species.

Let Shakespeare be Shakespeare.


Remembering Ambassador “Krees”

It’s been a month since Ambassador Christopher Stevens—a dynamic and gifted diplomat—was killed in the sudden and terrifying attack on the American consulate in Libya. The State Department continues to gather information, but it’s clear that the violent confrontation was not the result of a spontaneous riot incited by an amateurish anti-Islamic video posted on YouTube. It was a premeditated assault by a terrorist cell operating in Benghazi. Although many have rightly questioned the State Department’s inadequate protection of Stevens, the ambassador himself—who often opted out of tight security measures—would not have wanted his legacy overshadowed by these questions. He would want us to remember that he was killed while working to serve the world as well as his country. He approached his job like few others in the Foreign Service do. As he demonstrated in posts in Syria, Israel, Egypt, Saudi Arabia and Libya, he sincerely wanted to live with and among the people. This love of people and place was both his greatest strength and his undoing, as the Libyan reaction to his death demonstrates.

Tom Malinowski, writing in Foreign Policy magazine, poses this question: “Could anyone, whether cynic or optimist about the region, have dreamed of a better response to an attack on a diplomatic mission on Arab soil than what happened after the violence in Benghazi?” While Egypt and others in the Mideast offered only tepid condemnations, tens of thousands of ordinary Libyans, many holding signs honoring Ambassador Stevens, marched on the headquarters of the militias believed responsible for the assassination. Malinowski notes that, in addition to the sizable citizens’ protest, Libya’s highest religious leader issued a fatwa against Stevens’ killers, and local Islamist leaders openly expressed their grief. Said one to Malinowski, “You can’t imagine how sad we are.”

Many Libyans felt like French writer and activist Bernard-Henri Lévy, who wrote, “The fanatics who assassinated America’s ambassador to Libya…were not only criminals—they were imbeciles.” They killed a man who dreamed big dreams for Libya and who did the difficult work on the ground to free Libyans from Col. Gaddafi’s brutal and bizarre 42-year rule. Ambassador Stevens, according to NY Times writer Steven Erlanger, “traded personal risk for personal contact” so that he could build “a bridge to the tribes and militias who toppled the Libyan dictator.” As a brilliant diplomat and a friend to Libyans, he strove to create a new relationship between the U.S. and Libya.

He sought to usher in this new era by employing what his dear friend, Iranian-American writer Roya Hakakian, calls “his own inner cultural stethoscope.” “No matter where he was,” she recalls, “he could always hear the beating of local life.” Gregarious and curious, he was more interested in hearing others talk than in hogging the spotlight himself. And that, Hakakian explains, placed him in stark contrast to the usual Mideast stereotype of the greedy, arrogant and dimwitted American. Journalist Noga Tarnopolsky, who knew Stevens from Jerusalem, concurs: “Wherever he was living, he was able to let go of everything else and live that place completely.” Because of this, Stevens sometimes eschewed security measures so he could be the genuine, accessible diplomat he needed to be. He wanted real human contact, and he was not comfortable in his job when barricaded in a heavily guarded embassy.

Washington Post reporters Ernesto Londono and Abigail Hauslohner have documented the insufficient security at the U.S. diplomatic outpost in Benghazi. Although an American military team was assigned to protect the new embassy in Tripoli, it was not directed to buttress security at the provisional diplomatic center in the east of the country. Instead, the consulate was defended by a local guard force employed by a British private security firm under a paltry contract “worth less than half of what it costs to deploy a single U.S. service member in a war zone for a year.” This unit was easily outgunned on September 11th, when militants attacked with guns and rocket-propelled grenades. The irony is that Stevens died in the compound he had only grudgingly accepted. As a Libyan friend of his told the Washington Post, “Benghazi was his home. He used to run outside the compound. He felt very safe going to markets, to the square, meeting friends for coffee.” These local friendships were vital to him, and they served as a compelling compass in his work. When sending emails to friends, he often signed his name “Krees” instead of Chris—embracing the pronunciation his Arab friends used.

Over the past month, I have often found myself thinking about Ambassador “Krees.” What is it about this talented diplomat that has me ruminating about his lonely death—in a city I can barely imagine—thousands of miles away? Certainly his toothy grin, his openness, and his dedication to his work are all appealing aspects of his story. But there’s something else. In the service of his country and our world, Stevens rejoiced in the basic humanity he found in the world’s distant corners. He fiercely believed—and lived his conviction—that he could build community wherever he found himself. Some now call him a martyr or hero or the Indiana Jones of diplomacy. I’m just enormously grateful this lanky Californian saw himself in the faces of all his friends and neighbors the world over. What a wonderful gift to us all.


Grandma’s Kung Fu Fighting

My mom was not always a Kung Fu black belt. Like many women of her generation, she worked extremely hard raising children and managing our home. She lived overseas twice—in Germany and in Switzerland—for my father’s job. When we were out of elementary school, my mom fixed timepieces in a watch factory and then slogged away at an insurance company call center. She discovered her true calling after we all left for college: she took up tai chi to help her aching back. My mom did not know then that those first tentative steps were the beginning of a career. Now my parents’ home is full of medals and trophies won, certificates earned, and weapons my mom wields. Dad’s adjusted to carrying her cache of swords, staffs, fighting fans and other assorted bludgeons, but she’s not quite accustomed to the accolades she receives as a well-respected martial arts instructor. From hausfrau to crouching tiger—she reinvented herself.

We constantly allow—encourage, even—politicians, celebrities, and all too many televangelists to reinvent themselves, but we rarely give ourselves the same permission. We imagine our own identities as fixed and unchangeable.

In this light, I recently reconsidered a conversation I had with a friend last year. While discussing her house renovations, I asked if she had taken any “before and after” pictures of the work they’d done. She replied quickly and decisively, “Oh, no. We’re not those kinds of people.” I was struck dumb: What does snapping a few home remodeling pictures say about a person? And why was she so adamant that they are not “those kinds” of people? We regularly and unconsciously define ourselves against an imagined Other and then use this comparison to create a sense of ourselves. Built from our memories and the stories we tell about ourselves, these notions are just that—ideas. We make these stories The Story of our lives. And The Story—if we let it define us—limits us, reducing the choices we imagine for ourselves.

Ulric Neisser—the highly influential psychologist—challenged our long-held beliefs about the accuracy of our remembered stories. Neisser died in February, and Douglas Martin of the New York Times remembered him as a scientist who “helped lead a postwar revolution in the study of the human mind” through his research on perception and memory. He resisted the dominant post-World War II psychology discipline, behaviorism, and in the process created a new field—cognitive psychology. As Martin explains, Neisser’s research showed that “memory is a reconstruction of the past, not an accurate snapshot of it.” People think they remember actual events in near-perfect detail when they actually remember memories—and in some instances they remember memories of memories. Not great stuff upon which to build a static identity.

Years ago an acquaintance of mine wanted to quit smoking, but—in addition to the addiction itself—she simply couldn’t get beyond her self-concept. When she saw joggers zip past her house, she mocked the runners, inwardly or with her friends. She never imagined that she could be that runner she was so busy ridiculing; she defined herself as A Smoker. Then it occurred to her that this was simply a story she told about herself, and she could actually tell a different story. Over time she altered her ideas about herself and tried out another identity: Runner.

It is difficult to change the story if your friends discourage it. Nicholas Christakis, a social scientist and internist at Harvard, and James Fowler, a political scientist at UC San Diego, document this difficulty in their groundbreaking social network mapping of the famous Framingham Heart Study (FHS). This research—begun in Framingham, MA in 1948 with over 5,000 subjects—is a cardiovascular study that now spans three generations. Christakis and Fowler mapped the social networks of the original FHS participants and discovered that obesity, smoking, and even happiness appear to be passed among friends and families just like viruses. There is strong evidence that the 12,067 individuals in the FHS social network spread obesity person-to-person: if your friends or family members became obese, you were much more likely to become obese yourself. Christakis and Fowler theorize that obesity becomes normalized among cohorts, and this shifting view made it much more permissible to gain weight. Similar patterns emerged when they tracked smoking rates within the FHS social network. Says Fowler, “People quit together, or they didn’t quit at all.” They even observed that happy people tend to hang out with happy people and spread happiness within their social networks.

Napoleon Hill—writer and advisor to presidents Wilson and FDR—asserted, “What the mind of man can conceive and believe, it can achieve.” His book, Think and Grow Rich, is one of the bestselling books of all time and still—75 years later—makes “Top 10” lists of business books. Hill identified something that many of us know but find difficult to put into practice: Our ideas absolutely help shape who we become, and we can change these ideas to become who we want to be. So, don’t dump your friends or cut yourself off from that cousin who constantly berates you about your weight or your housekeeping. But do surround yourself with people who believe that transformation is possible.

My mom’s facing major surgery this year; she may have to stop Kung Fu fighting. As painful as this will be, we in her network will encourage her to transform once again. I can’t wait to see what comes next.