As a young child, I received an allowance of 25 cents a week. (A quarter was worth a lot more 'way back then.) From that I was expected to allocate some to spend as I pleased, some for the offering at church, and some to be saved into my small account at the bank. That was the beginning. My family had a culture of saving, as well as giving and spending. Saving was for the future—for larger-ticket items, and for unknown future needs.
Part of the excellent advice I received from my father as I was establishing my own household was to set up a regular savings plan, not only for future purchases but to ensure that I could handle at least a six-month period of unemployment—preferably a full year. Of course it took some time to save that much money when I had all the expenses of newly-independent living to meet, but by making it a priority I soon had a comfortable cushion against unexpected expenses.
Fortunately, I married a man with similar views, which were not uncommon among those of us whose parents had lived through the Depression days. For a number of years we were blessed with two incomes, but made a point of keeping our standard of living low enough that we could live on one and save the other. This stood us in very good stead when disaster hit the American information technology industry, and so many IT workers lost their jobs because the work was transferred to India and other places overseas.
But somewhere along the line the culture of saving was largely lost. Once considered a virtue, saving is now called "hoarding" and held in contempt. It seems to be considered a patriotic duty to spend all one's money—and more. (If true, we have been bleeding red, white, and blue during this pandemic.) However, the ugly consequences of this attitude are nowhere more apparent than in the large numbers of families facing financial disaster due to pandemic-related job loss. So many people have gone in the blink of an eye from enjoying comfortable incomes to standing in bread lines. If they had been encouraged to follow my father's advice and maintain a savings cushion of a year's salary, they would likely have been able to weather this storm with ease. But no one—not the government, not the media, not the schools, not our consumerist society, and apparently far too few parents—has been passing on this essential lesson.
I hope it won't take another Great Depression to recover our lost wisdom.
We're nearing the end of the year, and I've been (very pleasantly) inundated with more important ways to use my time than writing blog posts (more on that later). That's not to say writing isn't important to me; indeed, I find it essential for my mental health. However, to everything there is a season, and this season is writing-limited. So it seems like a good time to do some end-of-the-year decluttering of my collection of random ideas that can be dealt with relatively quickly. Here's one, a six-minute video by the remarkable Larry Elder that expresses well my own personal impressions of the effect of President Lyndon Johnson's "War on Poverty" (plus several other social factors), as well as what I learned during high school from Mr. Jim Balk, the most remarkable history teacher I ever had.
Feeling the time pressure here, so you get a quick post today. I no longer remember how I happened upon this video, but the Mister Rogers' Neighborhood fans among you (including Heather, pictured below) and/or jazz fans might enjoy it even more than I did.
I've been having fun cleaning out computer files, and came upon some old photos that made me wonder just how much our personalities are determined at birth or very early in our childhoods.
Here I am at age five, drinking water after enjoying a hike in the Adirondacks, wearing comfortable clothes and not sitting cross-legged. Comfort remains my highest criterion for choosing clothing, I have always loved spending time in the woods, water is my beverage of choice, and I have never, ever been able to sit cross-legged without serious discomfort.
Here's evidence from age six that my mother did her best to nurture my feminine side and teach me to be a proper young lady of the times. It didn't take. You can see by the expression on my face my heroic struggle to endure privation and torture for completely unfathomable reasons. Nothing has changed.
As a young mother, Thanksgiving at my in-laws' place in Moncks Corner, South Carolina, usually found me collapsed on the hearth in front of a comforting fire. This picture from Christmas at my grandparents' house in Rochester, New York shows that when I was seven I felt much the same way.
When this swing was new, it graced my grandfather's house. When he moved in with my family in Pennsylvania, the swing came with him. That particular swing is gone now, but when Porter found one for all practical purposes identical at our local Lowe's, I had to have one for our Florida back porch. North or south, for more than 60 years I have found it one of the most comfortable places ever to sit, rock, read, think ... and sleep.
I don't mean we can't change. Nature vs. Nurture should no longer be a debate, since the answer is so clearly "both/and." But it still surprises me when I come across evidence—in myself and others—that many of our present characteristics were manifested very early on, if we'd had the eyes to see. Mostly it's fascinating; it only becomes depressing when I look back and realize I'm still fighting battles it seems I ought to have conquered and moved on from long ago.
So I wonder: Should we, as parents, be more alert to problems in our children that could lead to trouble in the future, and deal with them, rather than letting them slide and hoping they'll outgrow them? If so, which ones are manifestly bad and need eliminating (e.g. lying, laziness, disobedience), and which ones are just part of what makes us individuals (such as a need for solitude, a sensitive nature, or a very logical mind), in which case our job is to help the child both take advantage of the good and develop coping strategies for the bad?
Digital Minimalism: Choosing a Focused Life in a Noisy World by Cal Newport (Portfolio/Penguin 2019)
Janet recommended this one to me, and after checking out Newport's TED talk, "Why You Should Quit Social Media," I decided to reserve it at the library. I had to wait in line; maybe more than a few people are rethinking Facebook, Twitter, Instagram, Old Uncle Tom Cobleigh and all.
Digital Minimalism is divided into two parts: Foundations and Practices. I read through Foundations easily, able to enjoy the book without pasting sticky tabs all over it. For me, this is like going somewhere and not taking pictures. Those sticky notes represent text that I will later laboriously transcribe for my reviews. As with the photos, something is gained but something is lost. I was enjoying the book and anticipating an easy review.
Then I hit Practices. Or Practices hit me.
The first chapter of that section, "Spend Time Alone," is about solitude deprivation. I could have sticky-noted the whole chapter. Here is me, restraining myself:
Everyone benefits from regular doses of solitude, and, equally important, anyone who avoids this state for an extended period of time will ... suffer. ... Regardless of how you decide to shape your digital ecosystem, you should give your brain the regular doses of quiet it requires to support a monumental life. (pp. 91-92).
[Raymond] Kethledge is a respected judge serving on the United States Court of Appeals for the Sixth Circuit, and [Michael] Erwin is a former army officer who served in both Iraq and Afghanistan. ... [Their book on the topic of solitude], Lead Yourself First ... summarizes, with the tight logic you expect from a federal judge and former military officer, [their] case for the importance of being alone with your thoughts. Before outlining their case, however, the authors start with what is arguably one of their most valuable contributions, a precise definition of solitude. Many people mistakenly associate this term with physical separation—requiring, perhaps, that you hike to a remote cabin miles from another human being. This flawed definition introduces a standard of isolation that can be impractical for most to satisfy on any sort of a regular basis. As Kethledge and Erwin explain, however, solitude is about what’s happening in your brain, not the environment around you. Accordingly, they define it to be a subjective state in which your mind is free from input from other minds. (pp. 92-93)
You can enjoy solitude in a crowded coffee shop, on a subway car, or, as President Lincoln discovered at his cottage, while sharing your lawn with two companies of Union soldiers, so long as your mind is left to grapple only with its own thoughts. On the other hand, solitude can be banished in even the quietest setting if you allow input from other minds to intrude. In addition to direct conversation with another person, these inputs can also take the form of reading a book, listening to a podcast, watching TV, or performing just about any activity that might draw your attention to a smartphone screen. Solitude requires you to move past reacting to information created by other people and focus instead on your own thoughts and experiences—wherever you happen to be. (pp. 93-94).
Regular doses of solitude, mixed in with our default mode of sociality, are necessary to flourish as a human being. It’s more urgent now than ever that we recognize this fact, because ... for the first time in human history solitude is starting to fade away altogether. (p. 99)
The concern that modernity is at odds with solitude is not new. ... The question before us, then, is whether our current moment offers a new threat to solitude that is somehow more pressing than those that commentators have bemoaned for decades. ... To understand my concern, the right place to start is the iPod revolution that occurred in the first years of the twenty-first century. We had portable music before the iPod ... but these devices played only a restricted role in most people’s lives—something you used to entertain yourself while exercising, or in the back seat of a car on a long family road trip. If you stood on a busy city street corner in the early 1990s, you would not see too many people sporting black foam Sony earphones on their way to work. By the early 2000s, however, if you stood on that same street corner, white earbuds would be near ubiquitous. The iPod succeeded not just by selling lots of units, but also by changing the culture surrounding portable music. It became common, especially among younger generations, to allow your iPod to provide a musical backdrop to your entire day—putting the earbuds in as you walk out the door and taking them off only when you couldn’t avoid having to talk to another human. (pp. 99-100).
This transformation started by the iPod, however, didn’t reach its full potential until the release of its successor, the iPhone.... Even though iPods became ubiquitous, there were still moments in which it was either too much trouble to slip in the earbuds (think: waiting to be called into a meeting), or it might be socially awkward to do so (think: sitting bored during a slow hymn at a church service). The smartphone provided a new technique to banish these remaining slivers of solitude: the quick glance. (p. 101)
When you avoid solitude, you miss out on the positive things it brings you: the ability to clarify hard problems, to regulate your emotions, to build moral courage, and to strengthen relationships. (p. 104)
Eliminating solitude also introduces new negative repercussions that we’re only now beginning to understand. A good way to investigate a behavior’s effect is to study a population that pushes the behavior to an extreme. When it comes to constant connectivity, these extremes are readily apparent among young people born after 1995—the first group to enter their preteen years with access to smartphones, tablets, and persistent internet connectivity. ... If persistent solitude deprivation causes problems, we should see them show up here first. ...
The head of mental health services at a well-known university ... told me that she had begun seeing major shifts in student mental health. ... Seemingly overnight the number of students seeking mental health counseling massively expanded, and the standard mix of teenage issues was dominated by something that used to be relatively rare: anxiety. ... The sudden rise in anxiety-related problems coincided with the first incoming classes of students that were raised on smartphones and social media. She noticed that these new students were constantly and frantically processing and sending messages. ...
[San Diego State University psychology professor Jean Twenge observed that] young people born between 1995 and 2012 are ... on the brink of the worst mental-health crisis in decades. ... [She] made it clear that she didn’t set out to implicate the smartphone: “It seemed like too easy an explanation for negative mental-health outcomes in teens,” but it ended up the only explanation that fit the timing. Lots of potential culprits, from stressful current events to increased academic pressure, existed before the spike in anxiety.... The only factor that dramatically increased right around the same time as teenage anxiety was the number of young people owning their own smartphones. ...
When journalist Benoit Denizet-Lewis investigated this teen anxiety epidemic in the New York Times Magazine, he also discovered that the smartphone kept emerging as a persistent signal among the noise of plausible hypotheses. “Anxious kids certainly existed before Instagram,” he writes, “but many of the parents I spoke to worried that their kids’ digital habits—round-the-clock responding to texts, posting to social media, obsessively following the filtered exploits of peers—were partly to blame for their children’s struggles.” Denizet-Lewis assumed that the teenagers themselves would dismiss this theory as standard parental grumbling, but this is not what happened. “To my surprise, anxious teenagers tended to agree.” A college student he interviewed at a residential anxiety treatment center put it well: “Social media is a tool, but it’s become this thing that we can’t live without that’s making us crazy.” (pp. 104-107)
The pianist Glenn Gould once proposed a mathematical formula for this cycle, telling a journalist: “I’ve always had a sort of intuition that for every hour you spend with other human beings you need X number of hours alone. Now what that X represents I don’t really know . . . but it’s a substantial ratio.” (p. 111)
The past two decades ... are characterized by the rapid spread of digital communication tools—my name for apps, services, or sites that enable people to interact through digital networks—which have pushed people’s social networks to be much larger and much less local, while encouraging interactions through short, text-based messages and approval clicks that are orders of magnitude less information laden than what we have evolved to expect. ... Much in the same way that the “innovation” of highly processed foods in the mid-twentieth century led to a global health crisis, the unintended side effects of digital communication tools—a sort of social fast food—are proving to be similarly worrisome. (p. 136)
After winning me over with the chapter on solitude deprivation, Newport lost me somewhat with his approach to taming the beasts. The basic problem is that, for a guy who has written several books and has his own blog, he seems to have very little respect for the written word.
Many people think about conversation and connection as two different strategies for accomplishing the same goal of maintaining their social life. This mind-set believes that there are many different ways to tend important relationships in your life, and in our current modern moment, you should use all tools available—spanning from old-fashioned face-to-face talking, to tapping the heart icon on a friend’s Instagram post.
The philosophy of conversation-centric communication takes a harder stance. It argues that conversation is the only form of interaction that in some sense counts toward maintaining a relationship. This conversation can take the form of a face-to-face meeting, or it can be a video chat or a phone call—so long as it matches Sherry Turkle’s criteria of involving nuanced analog cues, such as the tone of your voice or facial expressions. Anything textual or non-interactive—basically, all social media, email, text, and instant messaging—doesn’t count as conversation and should instead be categorized as mere connection. (p. 147)
I heartily disagree with his lumping e-mail in with "all social media, text, and instant messaging." I will grant that most social media, texts, WhatsApp, IM, and the like are severely limited by the difficulty of creating the message. Phones simply are not designed for high-speed typing, and I don't know about other people's experiences, but for me voice-to-text makes so many errors I spend almost as much time correcting as I would have spent laboriously pecking out a message on the tiny keyboard. (That's why I much prefer WhatsApp, where I can type my messages on the computer keyboard, to texting, where I can't.) So messages tend to be short, of restricted vocabulary and complexity, and full of nasty abbreviations. But e-mails are simply typed letters that get delivered with much more speed than the mail can achieve. I will grant that you miss the tone-of-voice cues that can be heard over the phone, but I think that's often more than made up for by the ability to both speak and listen without interruption. On the phone, if I turn all my attention to what the other person is saying, there's a long silence when it's my turn to talk while I think of how I want to respond. But if I try to figure that out while the other person is speaking, I'm likely to miss, or misinterpret, what is said. And when I'm speaking, it's more than likely that I will get interrupted before getting out my entire thought, and the conversation will veer off in another direction, leaving my response incomplete and likely misunderstood. E-mail leaves plenty of time for listening, thinking, and responding.
Newport has serious problems with Facebook's "Like" button. I can see his point in some respects.
The “Like” feature evolved to become the foundation on which Facebook rebuilt itself from a fun amusement that people occasionally checked, to a digital slot machine that began to dominate its users’ time and attention. This button introduced a rich new stream of social approval indicators that arrive in an unpredictable fashion—creating an almost impossibly appealing impulse to keep checking your account. It also provided Facebook much more detailed information on your preferences, allowing their machine-learning algorithms to digest your humanity into statistical slivers that could then be mined to push you toward targeted ads and stickier content. (p. 192)
I do get the slot-machine analogy. We all crave (positive) feedback for whatever of ourselves we have put "out there." And the temptation to keep checking is real. It reminds me of the joke from 'way back in the America Online days, in which the person sitting at the computer (no smartphones back then) checks his mail, sees that there is none waiting for him—and immediately checks again. It was funny because that's what so many people did. But I think Newport misunderstands how many of us use the Like button.
In the context of this chapter, however, I don’t want to focus on the boon the “Like” button proved to be for social media companies. I want to instead focus on the harm it inflicted to our human need for real conversation. To click “Like,” within the precise definitions of information theory, is literally the least informative type of nontrivial communication, providing only a minimal one bit of information about the state of the sender (the person clicking the icon on a post) to the receiver (the person who published the post). Earlier, I cited extensive research that supports the claim that the human brain has evolved to process the flood of information generated by face-to-face interactions. To replace this rich flow with a single bit is the ultimate insult to our social processing machinery. (p. 153)
But here's the thing. I don't know anyone who pretends that clicking "Like" or "Love" or "I care" is conversation. However, it is the digital equivalent of one part of a successful conversation: the nod, the smile, the grunt, the frown, the short interjection, which in face-to-face conversation serve as important lubricants to keep the conversation running smoothly. Such a gesture hardly communicates any more information than the Facebook buttons do; maybe it's little more than a bit—but it's an important bit. It says, "I'm listening, I hear you, I agree, keep talking," or "Wait, what you said confuses me, or angers me," or "I'm sorry, I sympathize."
As soon as easier communication technologies were introduced—text messages, emails—people seemed eager to abandon this time-tested method of conversation for lower-quality connections (Sherry Turkle calls this effect “phone phobia”). (p. 160)
Guilty as charged, but there's no need for Newport (or Turkle) to be snarky about it. I'm hardly alone, and there's ample evidence that phone phobia is attached to the same set of genes that makes me like mathematics. I love the (true) story a colleague told of a bunch of math grad students who decided to order pizza. Every one of them hemmed and hawed and delayed making the order, until the wife of one of the mathematicians, herself a grad student in philosophy, sighed, "For Pete's sake!" and called the restaurant. Text-based communication is a real boon to people like us. Call it a disability if you like—and then remember that you shouldn't mock or discriminate against people with disabilities.
Fortunately, there’s a simple practice that can help you sidestep these inconveniences and make it much easier to regularly enjoy rich phone conversations. I learned it from a technology executive in Silicon Valley who innovated a novel strategy for supporting high-quality interaction with friends and family: he tells them that he’s always available to talk on the phone at 5:30 p.m. on weekdays. There’s no need to schedule a conversation or let him know when you plan to call—just dial him up. As it turns out, 5:30 is when he begins his traffic-clogged commute home in the Bay Area. He decided at some point that he wanted to put this daily period of car confinement to good use, so he invented the 5:30 rule. The logistical simplicity of this system enables this executive to easily shift time-consuming, low-quality connections into higher-quality conversation. If you write him with a somewhat complicated question, he can reply, “I’d love to get into that. Call me at 5:30 any day you want.” Similarly, when I was visiting San Francisco a few years back and wanted to arrange a get-together, he replied that I could catch him on the phone any day at 5:30, and we could work out a plan. When he wants to catch up with someone he hasn’t spoken to in a while, he can send them a quick note saying, “I’d love to get up to speed on what’s going on in your life, call me at 5:30 sometime.” ... He hacked his schedule in such a way that eliminated most of the overhead related to conversation and therefore allowed him to easily serve his human need for rich interaction. (pp. 161-162)
I have to say, that strikes me as more selfish than clever. It's saying to everyone else that he will only communicate with them through his own preferred medium. Granted, it's his right to do so, and maybe he's learned that that's the best way he can get the most accomplished. But I'd have to be pretty desperate to call someone who I knew was going to be driving while talking with me. Either he's not going to be giving me his full attention, or he's not going to be giving the other cars on the road his full attention, neither of which strikes me as ideal. And if I have a complicated question, I definitely want the response to be by written word, where there's a record of what was said, and more chance of getting a well thought out response.
I’ve seen several variations of this practice work well. Using a commute for phone conversations, like the executive introduced above, is a good idea if you follow a regular commuting schedule. It also transforms a potentially wasted part of your day into something meaningful. Coffee shop hours are also popular. In this variation, you pick some time each week during which you settle into a table at your favorite coffee shop with the newspaper or a good book. The reading, however, is just the backup activity. You spread the word among people you know that you’re always at the shop during these hours with the hope that you soon cultivate a rotating group of regulars that come hang out. ... You can also consider running these office hours once a week during happy hour at a favored bar. (pp. 162-163)
<Shudder> Really? I'm supposed to go to the expense, inconvenience, and annoyance of sitting around at a coffee shop or bar on spec, just hoping a friend shows up? And expect my friends to be willing to pay an insane amount for a cup of coffee just to talk with me? Here, and in many other places in Digital Minimalism, you can tell that Newport is an extrovert—with plenty of spare cash—and friends who are the same.
And anyway, whatever happened to visiting people in their homes? One friend of ours decided to quit Facebook, and in her final message invited anyone in town to drop by her house for tea. I could get into that. If you're willing to get out and drive to a restaurant, come instead and knock at our door. You'll be more than welcome and none of us will have to buy an expensive drink. (This pandemic won't last forever.)
[In the early 20th century, Arnold Bennett, author of How to Live on 24 Hours a Day, speaking of leisure activities] argues that these hours should instead be put to use for demanding and virtuous leisure activities. Bennett, being an early twentieth-century British snob, suggests activities that center on reading difficult literature and rigorous self-reflection. In a representative passage, Bennett dismisses novels because they “never demand any appreciable mental application.” A good leisure pursuit, in Bennett’s calculus, should require more “mental strain” to enjoy (he recommends difficult poetry). (p. 175)
Newport approves of the idea that "the value you receive from a pursuit is often proportional to the energy invested." But then he adds,
For our twenty-first-century purposes, we can ignore the specific activities Bennett suggests. (p. 175)
And what, pray tell, is snobbish or unreasonable about literature and poetry?
Newport has a lot to say about the value of craft: of woodworking, or renovating a bathroom, or repairing a motorcycle, or knitting a sweater. He includes musical performances as well. But—and I find this odd for an author—he seems to have little respect for creating books. Would it be a more noble activity if they were typed on an old Remington, or handwritten? He similarly discounts composing music using a computer as less worthwhile than playing a guitar. I don't buy it.
The following story is for our two oldest grandsons, who have a way of picking up and enjoying construction skills.
[Pete's] welding odyssey began in 2005. At the time, he was building a custom home. ... The house was modern so Pete integrated some custom metalwork into his design plan, including a beautiful custom steel railing on the stairs.
The design seemed like a great idea until Pete received a quote from his metal contractor for the work: it was for $15,800, and Pete had budgeted only $4,000. “If this guy is billing out his metalworking time at $75.00 an hour, that’s a sign that I need to finally learn the craft myself,” Pete recalls thinking at the time. “How hard can it be?” In Pete’s hands, the answer turned out to be: not that hard.
Pete bought a grinder, a metal chop saw, a visor, heavy-duty gloves, and a 120-volt wire-feed flux core welder—which, as Pete explains, is by far the easiest welding device to learn. He then picked some simple projects, loaded up some YouTube videos, and got to work. Before long, Pete became a competent welder—not a master craftsman, but skilled enough to save himself tens of thousands of dollars in labor and parts. (As Pete explains it, he can’t craft a “curvaceous supercar,” but he could certainly weld up a “nice Mad-Max-style dune buggy.”) In addition to completing the railing for his custom home project (for much less than the $15,800 he was quoted), Pete went on to build a similar railing for a rooftop patio on a nearby home. He then started creating steel garden gates and unusual plant holders. He built a custom lumber rack for his pickup truck and fabricated a series of structural parts for straightening up old foundations and floors in the historic homes in his neighborhood. As Pete was writing his post on welding, a metal attachment bracket for his garage door opener broke. He easily fixed it. (pp. 194-195)
If you're wondering where to learn skills needed for simple projects ... the answer is easy. Almost every modern-day handyperson I've spoken to recommends the exact same source for quick how-to lessons: YouTube. (pp. 197-198, emphasis mine)
In the middle of a busy workday, or after a particularly trying morning of childcare, it’s tempting to crave the release of having nothing to do—whole blocks of time with no schedule, no expectations, and no activity beyond whatever seems to catch your attention in the moment. These decompression sessions have their place, but their rewards are muted, as they tend to devolve toward low-quality activities like mindless phone swiping and half-hearted binge-watching. ... Investing energy into something hard but worthwhile almost always returns much richer rewards. (p. 212)
Finally, I can't resist his description of a former Kickstarter project called the Light Phone.
Here’s how it works. Let’s say you have a Light Phone, which is an elegant slab of white plastic about the size of two or three stacked credit cards. This phone has a keypad and a small number display. And that’s it. All it can do is receive and make telephone calls—about as far as you can get from a modern smartphone while still technically counting as a communication device.
Assume you’re leaving the house to run some errands, and you want freedom from constant attacks on your attention. You activate your Light Phone through a few taps on your normal smartphone. At this point, any calls to your normal phone number will be forwarded to your Light Phone. If you call someone from it, the call will show up as coming from your normal smartphone number as well. When you’re ready to put the Light Phone away, a few more taps turns off the forwarding. This is not a replacement for your smartphone, but instead an escape hatch that allows you to take long breaks from it. (p. 245).
Despite our areas of disagreement, there's only one really, really annoying section of the book. He spends seven pages on the ideas of someone named Jennifer who "prefers the pronoun 'they/their' to 'she/her'." The ideas are not worth the ensuing confusion between singular and plural. I found myself constantly re-reading, trying to figure out who was being referenced in the text.
But I do recommend reading Digital Minimalism. The concept of solitude deprivation alone would make it worthwhile, and the rest of the book is pretty good, too—especially if you're not a phone-phobic, introverted author.
This post is about 40 years late in coming. But I'm reading Cal Newport's Digital Minimalism, and was struck by something right at the beginning.
Newport reports that Bill Maher—some television personality I'd never heard of, but that's not the point—had a show in which he likened the deliberately-engineered addictiveness of social media to the deliberately-engineered addictiveness of tobacco. "Philip Morris just wanted your lungs," he concluded. "The App Store wants your soul." It could be argued that what both really want is your money, and I don't think Maher would disagree.
Back to Sesame Street. Maher could have looked a little further and realized that the whole television industry is just as guilty of enticing us to mainline its products. I pick on Sesame Street largely because it was so strongly sold as something good for children, and the pushers were not just the networks but teachers and doctors and social workers and neighbors.
Yet the show was filled with all the techniques that promote addiction, shorten attention spans, and actually change our brains: bright colors; fast, catchy music; and rapid-fire changes of focus. And when you think about it, what exactly was educational about the show? What did it teach?
Letters and numbers? Nothing that spending that time with parents and siblings couldn't have taught faster and better.
What it did teach effectively was a certain kind of socialization. Sesame Street was a neighborhood, and those who produced the show had very definite ideas about what makes a good neighborhood and how neighbors should behave. (Note that these ideas sometimes changed over time, with videos of the older shows now labelled "for adults only," because they show children riding bikes without helmets and walking to the store unaccompanied by an adult.)
If your family's own values happen to coincide with those of the show's creators, well and good. If you happen to think a complete stranger can do a better job of helping your child learn to deal with anger, grief, fear, death, divorce, illness, disability, and even love than you can, well, there you go.
But if you disagree, know that Sesame Street is a very effective propaganda machine that is teaching a whole lot more than the alphabet. Even Mister Rogers' Neighborhood, which I considered to be a far superior show, couldn't resist trying to act in loco parentis on personal and social issues. What goes on in today's children's shows I don't have the need (or stomach) to investigate.
How is it that we parents have become so timid and unsure of ourselves that we're eager to turn much of our children's character formation over to others?
Recently I stumbled upon The Conservative Student's Survival Guide. It's a five-minute video offering advice to—you guessed it—conservative students who find themselves a despised minority on liberal college campuses. That's no joke: for all the talk you'll hear from academia about tolerance, liberal values, and minority rights, it's a jungle out there if your particular minority isn't currently in favor, and it seems the only status more dangerous than "conservative student" on most American campuses is "conservative faculty." It was true when we were in college, it was true when our children were in college—and everything I see leads me to believe the situation is far, far worse now.
What's surprising about this video is that, unlike much that comes from both Left and Right these days, it is calm, well-reasoned, and respectful. What's more, even though it's aimed at conservative students, any thoughtful person who wants to make the most of his college experience would do well to consider this advice.
The speaker is Matthew Woessner, a Penn State political science professor. All of his seven suggestions make sense, but my top three are these:
- Avoid pointless ideological battles. It's not your job to convert your professors or your fellow students. Discuss and debate, but don't push too hard.
- Choose [your classes and your major] wisely. I was a liberal atheist in college, but much on campus was too far Left even for me. Being a student of the hard sciences saved me from a great deal of the insanity that was going on in the humanities and social sciences departments. A quarter-century later, one of our daughters found some of the same relief as an engineering major. Our other daughter, however, discovered that life at a music conservatory was quite difficult—despite the name, conservative values were not welcome.
- Work hard—college faculty value hard-working, enthusiastic students. I'd say this is the most valuable of all his points. Excellence and enthusiasm are attractive. A student who participates respectfully in class, does the work, and learns the material will gain the respect and appreciation of most of his professors. Teachers are like that.
C. S. Lewis wrote the Preface to a book by B. G. Sandhurst entitled How Heathen is Britain? This essay has been republished as the thirteenth chapter of Lewis's book, God in the Dock, which I recently finished re-reading. I deemed the excerpts below too extensive for my review of that book, so here they are in their own post.
The essay, written in the mid-1940's, deals largely with the effect of state education on students' beliefs and attitudes about the Christian faith. A few quotes can't do justice to the logic of the argument, but should suffice to give the flavor. All bold emphasis is my own.
The content of, and the case for, Christianity, are not put before most schoolboys under the present system; ... when they are so put a majority find them acceptable. ... [These two facts] blow away a whole fog of "reasons for the decline of religion" which are often advanced and often believed. If we had noticed that the young men of the present day found it harder and harder to get the right answers to sums, we should consider that this had been adequately explained the moment we discovered that schools had for some years ceased to teach arithmetic. (p. 115)
The sources of unbelief among young people today do not lie in those young people. The outlook which they have—until they are taught better—is a backwash from an earlier period. It is nothing intrinsic to themselves which holds them back from the Faith. This very obvious fact—that each generation is taught by an earlier generation—must be kept very firmly in mind. (p. 116)
No generation can bequeath to its successor what it has not got. You may frame the syllabus as you please. But when you have planned and reported ad nauseam, if we are skeptical we shall teach only skepticism to our pupils, if fools only folly, if vulgar only vulgarity, if saints sanctity, if heroes heroism. ... Nothing which was not in the teachers can flow from them into the pupils. (p. 116)
A society which is predominantly Christian will propagate Christianity through its schools: one which is not, will not. All the ministries of education in the world cannot alter this law. We have, in the long run, little either to hope or fear from government.
The State may take education more and more firmly under its wing. I do not doubt that by so doing it can foster conformity, perhaps even servility, up to a point; the power of the State to deliberalize a profession is undoubtedly very great. But all the teaching must still be done by concrete human individuals. The State has to use the men who exist. Nay, as long as we remain a democracy, it is men who give the State its powers. And over these men, until all freedom is extinguished, the free winds of opinion blow. Their minds are formed by influences which government cannot control. And as they come to be, so will they teach. ... Let the abstract scheme of education be what it will: its actual operation will be what the men make it. ... Your "reform" may incommode and overwork them, but it will not radically alter the total effect of their teaching. (pp. 116-117)
Where the tide flows towards increasing State control, Christianity, with its claims in one way personal and in the other way ecumenical and both ways antithetical to omnicompetent government, must always in fact (though not for a long time yet in words) be treated as an enemy. Like learning, like the family, like any ancient and liberal profession, like the common law, it gives the individual a standing ground against the State. Hence Rousseau, the father of the totalitarians, said wisely enough, from his own point of view, of Christianity, Je ne connais rien de plus contraire à l'esprit social ("I know nothing more opposed to the social spirit"). ... Even if we were permitted to force a Christian curriculum on the existing schools with the existing teachers we should only be making masters hypocrites and hardening thereby the pupils' hearts. (p. 118)
I am speaking, of course, of large schools on which a secular character is already stamped. If any man, in some little corner out of the reach of the omnicompetent, can make, or preserve a really Christian school, that is another matter. His duty is plain. (p. 119)
What a society has, that, be sure, and nothing else, it will hand on to its young. The work is urgent, for men perish around us. But there is no need to be uneasy about the ultimate event. As long as Christians have children and non-Christians do not, one need have no anxiety for the next century. (p. 119)
Clearly Lewis did not anticipate that Christians would embrace the radical move to very small families nearly as much as secular society did. I'm thankful for those who are now reversing that trend. The idea is mocked today ("evangelism by procreation"), but Lewis—though he had a difficult home life and no biological children of his own—clearly recognized the life- and faith-affirming value of begetting and bearing children in Christian families.
As for the rest of the quotations: it is still true that democratic governments have much less control over what children think and learn than they would like. But the same is now also true of teachers. Lewis was thinking of the influence of teachers when he wrote,
Planning has no magic whereby it can elicit figs from thistles or choke-pears from vines. The rich, sappy, fruit-laden tree will bear sweetness and strength and spiritual health; the dry, prickly, withered tree will teach hate, jealousy, suspicion, and inferiority ... [no matter what] you tell it to teach. (pp. 117-118)
This is even more true, now, of the movies, music, and other media that are the very air our young people breathe (and rarely think about), and of the peer-oriented society we have bequeathed them. As I wrote in my review of Gordon Neufeld and Gabor Maté's book Hold On to Your Kids (a book I strongly recommend to all parents, grandparents, teachers, pastors, and anyone else who cares about children),
It is essential to the survival of a civilization that its culture be passed on from one generation to another. Today's children are not receiving culture, they are inventing it as they go along. We are into the third generation of this problem, and appear to be reaching a tipping point. If the idea of peer culture being more important to children than their family culture doesn't seem strange and wrong to us, it's because that's how we grew up, too.
I've never liked William Golding's book Lord of the Flies. Why high school English teachers think it helpful to assault students' spirits with the distressing imaginations of disturbed minds, I cannot imagine. My daughter hated it even more than I did, and although she finished reading the book weeks before the school exam, she steadfastly refused to reread or study it in any way. She'd rather fail, she insisted. (Actually, she aced the test. Having a good memory is both a curse and a blessing.)
Lord of the Flies certainly struck a chord with popular society, and like it or not has become part of our culture. Say to someone, "it's a Lord of the Flies situation there," and he knows exactly what you mean. It has also contributed to a good deal of negative and cynical thinking.
My sister-in-law, who knows my feelings on the matter, sent me this marvellous story about a real-life event that illustrates just the opposite about human behavior. (Warning: if you read the article, you may have to ignore some incidental, intense political ranting; I don't know what extras might be showing when you get there, but the last time I saw it I almost decided not to include the link. But credit must go where credit is due.) Here are some excerpts with the bare bones of the story:
This story [of the degradation and brutal behavior of castaway English schoolboys] never happened. An English schoolmaster, William Golding, made up this story in 1951—his novel Lord of the Flies would sell tens of millions of copies, be translated into more than 30 languages and hailed as one of the classics of the 20th century. In hindsight, the secret to the book’s success is clear. Golding had a masterful ability to portray the darkest depths of mankind. Of course, he had the zeitgeist of the 1960s on his side. (emphasis mine)
Years later, when I began delving into the author’s life, I learned what an unhappy individual he had been: an alcoholic, prone to depression. “I have always understood the Nazis,” Golding confessed, “because I am of that sort by nature.” And it was “partly out of that sad self-knowledge” that he wrote Lord of the Flies.
I began to wonder: had anyone ever studied what real children would do if they found themselves alone on a deserted island? I wrote an article on the subject, in which I compared Lord of the Flies to modern scientific insights and concluded that, in all probability, kids would act very differently. Readers responded sceptically. All my examples concerned kids at home, at school, or at summer camp. Thus began my quest for a real-life Lord of the Flies. After trawling the web for a while, I came across an obscure blog that told an arresting story: “One day, in 1977, six boys set out from Tonga on a fishing trip ... Caught in a huge storm, the boys were shipwrecked on a deserted island. What do they do, this little tribe? They made a pact never to quarrel.”
It took some work to uncover the source of the story, as the date given was incorrect; the real year was 1966.
The real Lord of the Flies ... began in June 1965. The protagonists were six boys—Sione, Stephen, Kolo, David, Luke and Mano—all pupils at a strict Catholic boarding school in Nuku‘alofa [the capital of Tonga]. The oldest was 16, the youngest 13, and they had one main thing in common: they were bored witless. So they came up with a plan to escape: to Fiji, some 500 miles away, or even all the way to New Zealand.
The boys "borrowed" a small sailing boat, and their voyage started out fine, with fair skies and a mild breeze.
But that night the boys made a grave error. They fell asleep. A few hours later they awoke to water crashing down over their heads. It was dark. They hoisted the sail, which the wind promptly tore to shreds. Next to break was the rudder. “We drifted for eight days,” Mano told me. “Without food. Without water.” The boys tried catching fish. They managed to collect some rainwater in hollowed-out coconut shells and shared it equally between them, each taking a sip in the morning and another in the evening. Then, on the eighth day, they spied a miracle on the horizon. A small island, to be precise. Not a tropical paradise with waving palm trees and sandy beaches, but a hulking mass of rock, jutting up more than a thousand feet out of the ocean.
These days, the island is considered uninhabitable. But by the time the boys were rescued, 15 months later,
[They] had set up a small commune with food garden, hollowed-out tree trunks to store rainwater, a gymnasium with curious weights, a badminton court, chicken pens and a permanent fire, all from handiwork, an old knife blade and much determination. While the boys in Lord of the Flies come to blows over the fire, those in this real-life version tended their flame so it never went out, for more than a year.
The kids agreed to work in teams of two, drawing up a strict roster for garden, kitchen and guard duty. Sometimes they quarrelled, but whenever that happened they solved it by imposing a time-out. Their days began and ended with song and prayer. Kolo fashioned a makeshift guitar from a piece of driftwood, half a coconut shell and six steel wires salvaged from their wrecked boat ... and played it to help lift their spirits. And their spirits needed lifting. All summer long it hardly rained, driving the boys frantic with thirst. They tried constructing a raft in order to leave the island, but it fell apart in the crashing surf.
Worst of all, Stephen slipped one day, fell off a cliff and broke his leg. The other boys picked their way down after him and then helped him back up to the top. They set his leg using sticks and leaves. “Don’t worry,” Sione joked. “We’ll do your work, while you lie there like King Taufa‘ahau Tupou himself!”
They survived initially on fish, coconuts, tame birds (they drank the blood as well as eating the meat); seabird eggs were sucked dry. Later, when they got to the top of the island, they found an ancient volcanic crater, where people had lived a century before. There the boys discovered wild taro, bananas and chickens (which had been reproducing for the 100 years since the last Tongans had left).
They were finally rescued on Sunday 11 September 1966. The local physician later expressed astonishment at their muscled physiques and Stephen’s perfectly healed leg.
The article has much more of the story, including how the rescued boys were immediately clapped into jail for having stolen the boat. (It ends well.)
This is a much better story than Lord of the Flies, and it's true. Sadly, it's unlikely to have the social impact of Golding's book. But I hope all English teachers who insist on teaching Golding will be inspired to include the real story in their discussions. At the very least, parents now have an antidote to offer their distressed children.
And we all have an antidote to the evening news and social media.
The school lunchbox is dead in Italy.
The Italian Supreme Court has ruled against parents who want to send lunch to school with their children. Their logic? Not eating the school-provided lunch is "a possible violation of the principles of equality and non-discrimination based on economic circumstances."
Even the United States isn't that crazy—yet—despite pushes in that direction by busybody experts who worry that food from home might not be "good enough," and school-lunch providers who have a deep financial stake in forcing parents to buy their product.
Parents, naturally, are not happy.
Lorenza, who has two children at a Turin school, told a local TV station she spent more than €2,000 (£1,823) on school meals, more than her monthly salary. "My older daughter was not happy because the quality of the food didn't justify the cost, and also because of the hygiene issues with the canteen. She would often complain that the cutlery was dirty, that the glasses were not particularly clean, or that there would be hairs on the plates," she said.
As with many news reports, this paragraph does not give enough information for us to know just how outraged we should be. Over what time period did this mother spend €2,000? One month, as implied by the comment that the cost was "more than her monthly salary"? Annually per child? Over the entire school experience of all of her (possibly, though not likely, many) children?
Never mind. It doesn't matter. Even if the meals were totally free (where by "free" we mean paid for by other people, of course), it would still be an outrage.
School lunches may be a necessity for some children, who would otherwise not eat—though I've never been able to answer satisfactorily a friend's question, "Isn't that what SNAP (formerly Food Stamps) and WIC programs are all about? Why do we also need free school lunches?"
School lunches are certainly a convenience for busy parents—though there is no reason why a child of school age shouldn't be able to pack his own lunch.
But there was never any doubt in my mind that my own packed lunch was vastly superior to what was offered in the school cafeteria, and apparently our children thought so, too. Even if they often traded their carrot sticks to other children for cookies, at least some child was eating healthful food. I'm reminded of one family I know who qualified for free meals for their children. The children gave it a try, determined that the food at home was better tasting, more nutritious, and even more plentiful—and wisely opted out. At least here they had that option.
More to the point: whatever the Italian Supreme Court may say, being able to feed our children as we think best is a basic, human, family right—right up there with being able to birth, educate, and otherwise rear our children as we think best. As all totalitarian governments know, once you come between parents and their children, most other freedoms become meaningless.
For those families who cannot or will not handle these responsibilities on their own, we rightly make assistance available. That's called charity. But forcing that "assistance" on those who do not want it? That's called tyranny.
And the "principles of equality" the court found so important? Should we make everyone feed their babies formula because some mothers can't or won't breastfeed? Dumb down the school curriculum to the lowest common denominator? Put every child in daycare because some families need that service? Force every child into public school because some parents can't or won't provide private or home education? Make every woman give birth in a hospital because some babies need a doctor's care? Ban unpasteurized milk, orange juice, and cider because not everyone has access to safe sources of these delicious drinks? Forbid handmade clothing because not every mother can sew? Put handicapping weights on the feet of the best dancers to eliminate their advantage over the klutzes?
Oh, wait. Objects in the mirror are closer than they appear.
People on Facebook and elsewhere have been wishing Florida students "happy first day of school." Leaving aside that I agree with C. S. Lewis that "the putting on of the school clothes was, I well knew, the assumption of a prison uniform," and that I am so glad to be past that part of our lives, I just have to say that August 12—the start date for many here—is a ridiculous day for the school year to commence.
Here in Florida it's not so bad, as all the buildings are air conditioned, and summer isn't the nicest season of the year anyway. But when our kids were in school and the district flirted with starting mid-summer, they had to choose between skipping some wonderful summer educational programs elsewhere in the country and skipping the beginning of school. (We chose the latter, but would rather not have had to do that.) Perhaps I shouldn't complain too much about that, however, or someone will suggest that school schedules should be set nationally, and I'm highly in favor of local control of schools. If people are going to wear chains, at least let those chains be of different colors.
Someone pointed out that we make up for starting early by getting out at the end of May, which is true. There's something to be said for that, though I'm not sure why one would cut off days at one end of summer just to sew them back on to the other end. The weather in June is sometimes nicer than in August, but you sure can't count on it. Still, shifting the calendar is at least better than the other thing schools have been doing: shrinking summer vacation and adding vacation days here and there throughout the year. I'm of two minds there. Granted, it's lovely to have days off in the middle of the school year, especially when the weather is nicer.
But nothing beats the traditional long, idyllic stretch of summer, when the days are free for reading, exploring, playing pick-up games with the neighbors, or just stretching out on the ground (or up in the treehouse) and watching the sky. The summer mindset doesn't come quickly. I noticed with our own children that a week's vacation from school wasn't nearly enough, because the beginning of the week was filled with what we called detoxification—as the children re-learned to order their own days—and the end with anticipation of the return to school. Summer was long enough for freedom to take hold in our hearts. I suspect teachers feel much the same way: after each return to school, it takes students time to settle back in, and as an anticipated vacation approaches, their focus is broken. Time is wasted when durations are too short.
All that aside: Be you student, teacher, or parent, if you've chosen (or had chosen for you) the life of being tied to the School Year—I do wish you the best: a happy first day of school and all the rest of them as well.
It's been a while since I posted in my Conservationist Living category, which is this post's primary classification, though I've assigned it to several others as well.
America is going to hell, right? Everybody says so. Including a whole lot of people who fervently believe there is no such place as hell, which is an interesting conundrum. But they all believe with equal fervor that we are going there rapidly. Believer or non-believer, left-wing or right-wing, we are convinced that we're in bad shape and on course to get much, much worse. What we disagree on is the attitudes, events, actions, and pathways that are taking us hell-ward.
Believe me, I'm not immune to such pessimism. Neither are you, so I'm going to tell you a small part of the story of Dave Anderson.
The Andersons are friends of our daughter's family, from their Pittsburgh days. Dave's success at building a good life for his family while reclaiming a worn-out strip mine and putting to good use many hundreds of tons of refuse every year was featured last month in this Pittsburgh Post-Gazette article.
I made the 45-minute drive to Echo Valley Farm this week because I wanted to meet the man who’d turned strip-mined land in northwest Beaver County into 26 grassy acres on which beef cattle thrive. Mr. Anderson had told me he revived his land by mixing hundreds of thousands of used paper cups from the Pittsburgh Marathon with manure, hay, banana peels and restaurant refuse.
You'll want to read the whole article to learn about the symbiotic relationship between the farm, which needs nourishment, and the private businesses and local governments, which need waste disposal: an arrangement that's a win for everyone involved.
It all works because there’s something in it for everyone. Mr. Anderson said that 14 years ago, the field over my shoulder produced 6,000 pounds of hay at the first cutting. The cutting in [the] same field last year brought 37,000 pounds.
Plus, the farm is a great place to raise kids.
I ask if it’s just the two of them and he says, no, he and his wife, Elaine, have six girls and a boy. They range in age from 10 to 24. All seven of them comprise Echo Valley, a bluegrass/gospel/Celtic band that just played in Harrodsburg, Ky., Saturday night.
Several years ago—it was probably more than ten, though I'm finding that hard to believe—we visited the budding farm for one of their many social gatherings of food, music, and fun. Kids and animals were everywhere. The children were much younger then, of course, but they were already solid musicians. Here is a more recent video of the group.
And here is one of my favorites from earlier, just for fun.
Mr. Anderson, an inveterate reader who doesn’t own a TV, and who also was an air traffic manager until he retired last Friday, figured out how to turn desolate land into a lush farm that supports a family of nine with 30 head of beef cattle, six miniature donkeys, 40 laying hens, two turkeys, four guinea fowl, three geese, three ducks, two Australian cattle dogs and six pups.
Not to mention a number of cats, as I recall.
I hope this brightened your day. If America is, indeed, going to hell, people like the Andersons are pulling mightily in the other direction.
There's no doubt that video games and manipulating phones and tablets develop certain skills. But if we think all that button pushing and finger-swiping are improving manual dexterity, apparently it's not doing so in ways that still matter greatly—such as the skills needed by a surgeon.
Roger Kneebone, professor of surgical education at Imperial College, London, says young people have so little experience of craft skills that they struggle with anything practical. ... "It is a concern of mine and my scientific colleagues that whereas in the past you could make the assumption that students would leave school able to do certain practical things—cutting things out, making things—that is no longer the case," says Prof Kneebone.
Prof Kneebone says he has seen a decline in the manual dexterity of students over the past decade. ... Students have become "less competent and less confident" in using their hands, he says. ... "We have students who have very high exam grades but lack tactile general knowledge."
Such skills might once have been gained at school or at home, whether in cutting textiles, measuring ingredients, repairing something that's broken, learning woodwork or holding an instrument.
Is this something to be gravely concerned about, or will we simply turn surgery over to robots the way we have turned shifting the gears in our cars over to automatic transmissions?
About once a year or so we actually go out to a theater and watch a movie. I knew I wanted to see Unplanned, and did not have any confidence that it would eventually make it to Netflix. So Porter bought tickets online for our local AMC theater, and we made a date of it.
"Date" is an appropriate word, because despite the seriousness of the subject and a couple of horrifying scenes that probably earned it its "R" rating, Unplanned is basically a love story: The unconditional love of parents for a child who has made lifestyle choices in complete opposition to their own deeply-held values; the steadfast love of a man in support of his wife despite his conviction that her chosen career path is an immoral one; the love that leads us to embrace our common humanity in the face of chasmic differences; and the relentless love of God for his hurting world—"unresting, unhasting, and silent as light."
Abby Johnson's desire to make a difference in the world, to support the rights of women, and to help women in crisis situations led her, beginning as a student volunteer at the local Planned Parenthood clinic, to a promising career with that organization. She became one of the youngest-ever clinic directors, and won an Employee of the Year award in 2008.
And then that same heart-felt desire to help women led her to quit. Unplanned is her story.
The story is well told. The movie is beautiful—except of course where it's ugly. I particularly like the fact that it is not a black-and-white, one-dimensional story of a sudden conversion, despite the "what she saw changed everything" subtitle. As much as can be done in a movie less than two hours in length, we see Abby's growth through time and experience. Her change of heart seems more of a tipping point than a crisis, though there are certainly elements of the latter as well. Abby at the end of the movie is more knowledgeable, more experienced, certainly less naïve, and moving in a different direction in more than one area of her life—but still Abby.
The only fault I find is the portrayal of Abby's boss, who is indeed one-dimensional; we never see her human side. It reminds me of what C. S. Lewis said about George MacDonald, that he was rare among authors in being able to portray good much better than evil: "His saints live; his villains are stagey." It's certainly possible that this woman was as nasty as she seems, and as I said, it's a short movie, but I would like to have seen something redeeming about her character.
Do I recommend seeing Unplanned when you have the chance? Absolutely, 100%, a hundred times over. Do I recommend it for our grandchildren? Eventually. They're all under age for the rating at this point, anyway. Maybe the oldest one or two could handle it well, if their parents watch the film first and agree. Anyone younger than that would be traumatized, maybe scarred for life—if they understood it at all. At first I wondered about the R rating, given the horrible things I've seen in PG-13 movies, but I believe the MPAA got it right in this case. Unplanned is a beautiful movie, and an important one, but there's no denying that it's disturbing in a way no child should be asked to handle. Not that so many kids haven't already seen worse. And it's rather bizarre to require parental consent for a child to watch a movie with a few abortion scenes, when that same child could actually have an abortion without it.
It took a long time for me to dip my toes into the DNA testing waters, being both an avid genealogist and a very private person. But just as giving birth changed my relationship to modesty, starting a blog changed my relationship to privacy. I'm still both modest and private, but not in the same way. The biggest obstacle to DNA testing was knowing I was dragging my family along. As recent events have shown, criminal behavior (and other indiscretions) can be found out by DNA through relatives' information available on genealogy websites.
But I discovered long ago that privacy as we knew it is dead. I remember working with a family researcher who was writing a book on one side of our family. At one time, I would have refused to contribute any information, but I had since been helped so much in my research by a book on the Wightman Family that I wanted to help others the same way. The Wightman book, incidentally, has information on me and our family that was contributed without my knowledge or consent. At the time I was not happy, but I got over that and now appreciate it. Except for where the data is wrong....
The point, however, is that while such direct contributions help researchers, they're not all that necessary. When one of my family members declined to contribute his family's information to the project I was helping with, the researcher understood his reluctance—but he added, "Let me show you the information I've already obtained from public sources." He already had just about everything he could use. As Dr. Mallard (Illya Kuryakin in an earlier life) said on NCIS last night, "The Internet will be the death of us." Or at least of privacy.
In light of all this, Porter and I each decided to submit a sample to AncestryDNA.com, and eagerly awaited the results. Later we uploaded the DNA data to MyHeritage.com, and eventually gave another sample to 23andMe.com—the latter for both the ancestry and the health screening.
This post is not a detailed analysis of the results, but an overall impression of the value of the DNA testing. First, from the point of view of genealogy.
For us, the Ancestry.com screening was the most useful. This is for two reasons.
- They have the largest database from which to work, and that is what makes the testing useful—comparing your DNA to that of other populations. For this reason it is also most useful for those of European background, because of the large numbers of that population who have participated. The testing services are working to improve the experience for under-represented populations, but for now the data is not so robust.
- I have uploaded our family tree, with its nearly 15,000 individuals, to Ancestry.com, and that's largely what makes their DNA service helpful for genealogy. This gives context to our DNA matches, and I've already confirmed known relatives while learning of several more. My tree is at the moment private on Ancestry, which means people have to ask me about the information, which is a good way to get to meet them. Someday I will make it, or at least a version of it, public, but the tree itself isn't ready for that exposure yet.
No doubt MyHeritage would be more useful if I put a tree up there as well, but that's on the "Someday/Maybe" list. I only uploaded our data because at the time they gave free access to their resources if you did. So far they've only found us "third-to-fifth cousins"—tons of them—which is not of much use without trees to compare, and most people seem to have no trees or very small ones. Third cousins share a great-great-grandfather, so it requires a significant amount of family history knowledge to make the connection.
23andMe is in the same situation as far as genealogy goes. So far nothing found even as close as second cousin (sharing a great-grandfather).
How has this helped my genealogy research? Well, through Ancestry.com I've connected with a few previously unknown cousins, a couple close enough to be useful in sharing information. Even the ones that are more distant have been useful in providing some confirmation of my research. Overall I'm glad I took the plunge, if only for this reason. It also has a lot of potential for more and better information as time goes on. One important caveat: There is a lot of error in online family trees. Even with DNA support, this information is best taken as inspiration for further research, and for mutual sharing of data sources.
Now for what most people want out of DNA testing: heritage and ethnicity information. This is an estimate only, and each company has its own data and algorithm for making its "best guess." Sometime after we had our samples analyzed, Ancestry.com upgraded their system and re-analyzed our data. The results were not terribly different from the first attempt, though probably more accurate.
The analysis from MyHeritage was closer to Ancestry's original analysis. That from 23andMe was different from any of the others, though quite similar overall.
My impression? The DNA analysis is very good as an overall picture, not so good on the details. For example, Porter's great-grandparents came to the United States from Sweden, and it is well known where they lived before emigrating. In fact, when his dad visited Sweden, he was told he looks just like people who live in that area. Thus when his father's AncestryDNA analysis came back showing his largest ethnicity to be Norwegian, we were taken aback. However, the area he's from may be called Sweden, but it's right on the border with Norway. One can definitely say from his DNA that he is of Scandinavian origin, but that he is specifically Swedish comes from genealogy. One must also remember that the smaller percentages are suspect: of the three analyses, 23andMe was the only one that gave me "broadly East Asian and Native American" ancestry, and that was at just 0.1%, so highly doubtful.
Finally, there's the analysis of genetic health data. This comes primarily from 23andMe, though we also paid an extra $10 post facto for Ancestry's "Traits" screening. I've written about the latter experience before. 23andMe analyzes many more traits than Ancestry's small sample, from "Leigh Syndrome, French Canadian Type" carrier status, to estimated risk for late-onset Alzheimer's Disease, to Lactose Intolerance, to Asparagus Odor Detection.
My thoughts? Interesting, but not quite ready for prime time. Where I have independent data it sometimes confirms, sometimes contradicts the DNA reports. Ancestry says I likely have a "unibrow" but 23andMe says the opposite. Both of them say I probably hate cilantro, and I love it. And so on. So I'm taking the rest of what they say with a few grains of salt. I'm sure there's something to it, and that the data will get better with time, but for now it is more entertainment than useful information. Actually, I take that back: Just as DNA ancestry data is useful as a starting point for further research, the discovery of certain traits might be useful for suggesting further, medical, genetic testing.
There's a lot more to DNA analysis for the serious genealogy researcher to investigate, such as sites that will take your data and give you tools to learn much more about which particular genes you and a DNA match share. I'm not there yet; I have too much to do with my regular research to explore that path further. But it, and my data, are there when I'm ready.
Am I glad I decided to "spit in the tube"? Absolutely; I'd do it again and may later go further with it. I'm very grateful to family members who have taken the plunge as well, because that provides a look at the puzzle from more angles. But it's always important not to expect too much. It's never as simple as trading your kilt for lederhosen, as the Ancestry.com ad blithely shows. Plus there's a risk of finding out things you don't want to know—about family or about health. It's a very personal decision and I understand those who are reluctant to take the risk.