Digital Minimalism: Choosing a Focused Life in a Noisy World by Cal Newport (Portfolio/Penguin 2019)
Janet recommended this one to me, and after checking out Newport's TED talk, "Why You Should Quit Social Media," I decided to reserve it at the library. I had to wait in line; maybe more than a few people are rethinking Facebook, Twitter, Instagram, Old Uncle Tom Cobleigh and all.
Digital Minimalism is divided into two parts: Foundations, and Practices. I read through Foundations easily, able to enjoy the book without pasting sticky tabs all over it. For me, this is like going somewhere and not taking pictures. Those sticky notes represent text that I will later laboriously transcribe for my reviews. As with the photos, something is gained but something is lost. I was enjoying the book and anticipating an easy review.
Then I hit Practices. Or Practices hit me.
The first chapter of that section, "Spend Time Alone," is about solitude deprivation. I could have sticky-noted the whole chapter. Here is me, restraining myself:
Everyone benefits from regular doses of solitude, and, equally important, anyone who avoids this state for an extended period of time will ... suffer. ... Regardless of how you decide to shape your digital ecosystem, you should give your brain the regular doses of quiet it requires to support a monumental life. (pp. 91-92).
[Raymond] Kethledge is a respected judge serving on the United States Court of Appeals for the Sixth Circuit, and [Michael] Erwin is a former army officer who served in both Iraq and Afghanistan. ... [Their book on the topic of solitude], Lead Yourself First ... summarizes, with the tight logic you expect from a federal judge and former military officer, [their] case for the importance of being alone with your thoughts. Before outlining their case, however, the authors start with what is arguably one of their most valuable contributions, a precise definition of solitude. Many people mistakenly associate this term with physical separation—requiring, perhaps, that you hike to a remote cabin miles from another human being. This flawed definition introduces a standard of isolation that can be impractical for most to satisfy on any sort of a regular basis. As Kethledge and Erwin explain, however, solitude is about what’s happening in your brain, not the environment around you. Accordingly, they define it to be a subjective state in which your mind is free from input from other minds. (pp. 92-93)
You can enjoy solitude in a crowded coffee shop, on a subway car, or, as President Lincoln discovered at his cottage, while sharing your lawn with two companies of Union soldiers, so long as your mind is left to grapple only with its own thoughts. On the other hand, solitude can be banished in even the quietest setting if you allow input from other minds to intrude. In addition to direct conversation with another person, these inputs can also take the form of reading a book, listening to a podcast, watching TV, or performing just about any activity that might draw your attention to a smartphone screen. Solitude requires you to move past reacting to information created by other people and focus instead on your own thoughts and experiences—wherever you happen to be. (pp. 93-94).
Regular doses of solitude, mixed in with our default mode of sociality, are necessary to flourish as a human being. It’s more urgent now than ever that we recognize this fact, because ... for the first time in human history solitude is starting to fade away altogether. (p. 99)
The concern that modernity is at odds with solitude is not new. ... The question before us, then, is whether our current moment offers a new threat to solitude that is somehow more pressing than those that commentators have bemoaned for decades. ... To understand my concern, the right place to start is the iPod revolution that occurred in the first years of the twenty-first century. We had portable music before the iPod ... but these devices played only a restricted role in most people’s lives—something you used to entertain yourself while exercising, or in the back seat of a car on a long family road trip. If you stood on a busy city street corner in the early 1990s, you would not see too many people sporting black foam Sony earphones on their way to work. By the early 2000s, however, if you stood on that same street corner, white earbuds would be near ubiquitous. The iPod succeeded not just by selling lots of units, but also by changing the culture surrounding portable music. It became common, especially among younger generations, to allow your iPod to provide a musical backdrop to your entire day—putting the earbuds in as you walk out the door and taking them off only when you couldn’t avoid having to talk to another human. (pp. 99-100).
This transformation started by the iPod, however, didn’t reach its full potential until the release of its successor, the iPhone.... Even though iPods became ubiquitous, there were still moments in which it was either too much trouble to slip in the earbuds (think: waiting to be called into a meeting), or it might be socially awkward to do so (think: sitting bored during a slow hymn at a church service). The smartphone provided a new technique to banish these remaining slivers of solitude: the quick glance. (p. 101)
When you avoid solitude, you miss out on the positive things it brings you: the ability to clarify hard problems, to regulate your emotions, to build moral courage, and to strengthen relationships. (p. 104)
Eliminating solitude also introduces new negative repercussions that we’re only now beginning to understand. A good way to investigate a behavior’s effect is to study a population that pushes the behavior to an extreme. When it comes to constant connectivity, these extremes are readily apparent among young people born after 1995—the first group to enter their preteen years with access to smartphones, tablets, and persistent internet connectivity. ... If persistent solitude deprivation causes problems, we should see them show up here first. ...
The head of mental health services at a well-known university ... told me that she had begun seeing major shifts in student mental health. ... Seemingly overnight the number of students seeking mental health counseling massively expanded, and the standard mix of teenage issues was dominated by something that used to be relatively rare: anxiety. ... The sudden rise in anxiety-related problems coincided with the first incoming classes of students that were raised on smartphones and social media. She noticed that these new students were constantly and frantically processing and sending messages. ...
[San Diego State University psychology professor Jean Twenge observed that] young people born between 1995 and 2012 are ... on the brink of the worst mental-health crisis in decades. ... [She] made it clear that she didn’t set out to implicate the smartphone: “It seemed like too easy an explanation for negative mental-health outcomes in teens,” but it ended up the only explanation that fit the timing. Lots of potential culprits, from stressful current events to increased academic pressure, existed before the spike in anxiety.... The only factor that dramatically increased right around the same time as teenage anxiety was the number of young people owning their own smartphones. ...
When journalist Benoit Denizet-Lewis investigated this teen anxiety epidemic in the New York Times Magazine, he also discovered that the smartphone kept emerging as a persistent signal among the noise of plausible hypotheses. “Anxious kids certainly existed before Instagram,” he writes, “but many of the parents I spoke to worried that their kids’ digital habits—round-the-clock responding to texts, posting to social media, obsessively following the filtered exploits of peers—were partly to blame for their children’s struggles.” Denizet-Lewis assumed that the teenagers themselves would dismiss this theory as standard parental grumbling, but this is not what happened. “To my surprise, anxious teenagers tended to agree.” A college student he interviewed at a residential anxiety treatment center put it well: “Social media is a tool, but it’s become this thing that we can’t live without that’s making us crazy.” (pp. 104-107)
The pianist Glenn Gould once proposed a mathematical formula for this cycle, telling a journalist: “I’ve always had a sort of intuition that for every hour you spend with other human beings you need X number of hours alone. Now what that X represents I don’t really know . . . but it’s a substantial ratio.” (p. 111)
The past two decades ... are characterized by the rapid spread of digital communication tools—my name for apps, services, or sites that enable people to interact through digital networks—which have pushed people’s social networks to be much larger and much less local, while encouraging interactions through short, text-based messages and approval clicks that are orders of magnitude less information laden than what we have evolved to expect. ... Much in the same way that the “innovation” of highly processed foods in the mid-twentieth century led to a global health crisis, the unintended side effects of digital communication tools—a sort of social fast food—are proving to be similarly worrisome. (p. 136)
After winning me over with the chapter on solitude deprivation, Newport lost me somewhat with his approach to taming the beasts. The basic problem is that, for a guy who has written several books and has his own blog, he seems to have very little respect for the written word.
Many people think about conversation and connection as two different strategies for accomplishing the same goal of maintaining their social life. This mind-set believes that there are many different ways to tend important relationships in your life, and in our current modern moment, you should use all tools available—spanning from old-fashioned face-to-face talking, to tapping the heart icon on a friend’s Instagram post.
The philosophy of conversation-centric communication takes a harder stance. It argues that conversation is the only form of interaction that in some sense counts toward maintaining a relationship. This conversation can take the form of a face-to-face meeting, or it can be a video chat or a phone call—so long as it matches Sherry Turkle’s criteria of involving nuanced analog cues, such as the tone of your voice or facial expressions. Anything textual or non-interactive—basically, all social media, email, text, and instant messaging—doesn’t count as conversation and should instead be categorized as mere connection. (p. 147)
I heartily disagree with his lumping e-mail in with "all social media, text, and instant messaging." I will grant that most social media, texts, WhatsApp, IM, and the like are severely limited by the difficulty of creating the message. Phones simply are not designed for high-speed typing, and I don't know about other people's experiences, but for me voice-to-text makes so many errors I spend almost as much time correcting as I would have laboriously pecking out a message on the tiny keyboard. (That's why I much prefer WhatsApp, where I can type my messages on the computer keyboard, to texting, where I can't.) So messages tend to be short, of restricted vocabulary and complexity, and full of nasty abbreviations. But e-mails are simply typed letters that get delivered with much more speed than the mail can achieve. I will grant that you miss the tone-of-voice cues that can be heard over the phone, but I think that's often more than made up for by the ability to both speak and listen without interruption. On the phone, if I turn all my attention to what the other person is saying, there's a long silence when it's my turn to talk while I think of how I want to respond. But if I try to figure that out while the other person is speaking, I'm likely to miss, or misinterpret, what is said. And when I'm speaking, it's more than likely that I will get interrupted before getting out my entire thought, and the conversation will veer off in another direction, leaving my response incomplete and likely misunderstood. E-mail leaves plenty of time for listening, thinking, and responding.
Newport has serious problems with Facebook's "Like" button. I can see his point in some respects.
The “Like” feature evolved to become the foundation on which Facebook rebuilt itself from a fun amusement that people occasionally checked, to a digital slot machine that began to dominate its users’ time and attention. This button introduced a rich new stream of social approval indicators that arrive in an unpredictable fashion—creating an almost impossibly appealing impulse to keep checking your account. It also provided Facebook much more detailed information on your preferences, allowing their machine-learning algorithms to digest your humanity into statistical slivers that could then be mined to push you toward targeted ads and stickier content. (p. 192)
I do get the slot-machine analogy. We all crave (positive) feedback for whatever of ourselves we have put "out there." And the temptation to keep checking is real. It reminds me of the joke from 'way back in the America Online days, in which the person sitting at the computer (no smartphones back then) checks his mail, sees that there is none waiting for him—and immediately checks again. It was funny because that's what so many people did. But I think Newport misunderstands how many of us use the Like button.
In the context of this chapter, however, I don’t want to focus on the boon the “Like” button proved to be for social media companies. I want to instead focus on the harm it inflicted to our human need for real conversation. To click “Like,” within the precise definitions of information theory, is literally the least informative type of nontrivial communication, providing only a minimal one bit of information about the state of the sender (the person clicking the icon on a post) to the receiver (the person who published the post). Earlier, I cited extensive research that supports the claim that the human brain has evolved to process the flood of information generated by face-to-face interactions. To replace this rich flow with a single bit is the ultimate insult to our social processing machinery. (p. 153)
But here's the thing. I don't know anyone who pretends that clicking "Like" or "Love" or "I care" is conversation. However, it is the digital equivalent of one part of a successful conversation: the nod, the smile, the grunt, the frown, the short interjection, which in face-to-face conversation serve as important lubricants to keep a conversation running smoothly. These hardly communicate any more information than the Facebook buttons; maybe it's little more than a bit—but it's an important bit. It says, "I'm listening, I hear you, I agree, keep talking," or "Wait, what you said confuses me, or angers me," or "I'm sorry, I sympathize."
As soon as easier communication technologies were introduced—text messages, emails—people seemed eager to abandon this time-tested method of conversation for lower-quality connections (Sherry Turkle calls this effect “phone phobia”). (p. 160)
Guilty as charged, but there's no need for Newport (or Turkle) to be snarky about it. I'm hardly alone, and there's ample evidence that phone phobia is attached to the same set of genes that makes me like mathematics. I love the (true) story a colleague told of a bunch of math grad students who decided to order pizza. Every one of them hemmed and hawed and delayed making the order, until the wife of one of the mathematicians, herself a grad student in philosophy, sighed, "For Pete's sake!" and called the restaurant. Text-based communication is a real boon to people like us. Call it a disability if you like—and then remember that you shouldn't mock or discriminate against people with disabilities.
Fortunately, there’s a simple practice that can help you sidestep these inconveniences and make it much easier to regularly enjoy rich phone conversations. I learned it from a technology executive in Silicon Valley who innovated a novel strategy for supporting high-quality interaction with friends and family: he tells them that he’s always available to talk on the phone at 5:30 p.m. on weekdays. There’s no need to schedule a conversation or let him know when you plan to call—just dial him up. As it turns out, 5:30 is when he begins his traffic-clogged commute home in the Bay Area. He decided at some point that he wanted to put this daily period of car confinement to good use, so he invented the 5:30 rule. The logistical simplicity of this system enables this executive to easily shift time-consuming, low-quality connections into higher-quality conversation. If you write him with a somewhat complicated question, he can reply, “I’d love to get into that. Call me at 5:30 any day you want.” Similarly, when I was visiting San Francisco a few years back and wanted to arrange a get-together, he replied that I could catch him on the phone any day at 5:30, and we could work out a plan. When he wants to catch up with someone he hasn’t spoken to in a while, he can send them a quick note saying, “I’d love to get up to speed on what’s going on in your life, call me at 5:30 sometime.” ... He hacked his schedule in such a way that eliminated most of the overhead related to conversation and therefore allowed him to easily serve his human need for rich interaction. (pp. 161-162)
I have to say, that strikes me as more selfish than clever. It's saying to everyone else that he will only communicate with them through his own preferred medium. Granted, it's his right to do so, and maybe he's learned that that's the best way he can get the most accomplished. But I'd have to be pretty desperate to call someone who I knew was going to be driving while he is talking with me. Either he's not going to be giving me his full attention, or he's not going to be giving the other cars on the road his full attention, neither one of which strikes me as ideal. And if I have a complicated question, I definitely want the response to be by written word, where there's a record of what was said, and more chance of getting a well thought out response.
I’ve seen several variations of this practice work well. Using a commute for phone conversations, like the executive introduced above, is a good idea if you follow a regular commuting schedule. It also transforms a potentially wasted part of your day into something meaningful. Coffee shop hours are also popular. In this variation, you pick some time each week during which you settle into a table at your favorite coffee shop with the newspaper or a good book. The reading, however, is just the backup activity. You spread the word among people you know that you’re always at the shop during these hours with the hope that you soon cultivate a rotating group of regulars that come hang out. ... You can also consider running these office hours once a week during happy hour at a favored bar. (pp. 162-163)
<Shudder> Really? I'm supposed to go to the expense, inconvenience, and annoyance of sitting around at a coffee shop or bar on spec, just hoping a friend shows up? And expect my friends to be willing to pay an insane amount for a cup of coffee just to talk with me? Here, and in many other places in Digital Minimalism, you can tell that Newport is an extrovert—with plenty of spare cash—and friends who are the same.
And anyway, whatever happened to visiting people in their homes? One friend of ours decided to quit Facebook, and in her final message invited anyone in town to drop by her house for tea. I could get into that. If you're willing to get out and drive to a restaurant, come instead and knock at our door. You'll be more than welcome and none of us will have to buy an expensive drink. (This pandemic won't last forever.)
[In the early 20th century, Arnold Bennett, author of How to Live on 24 Hours a Day, speaking of leisure activities] argues that these hours should instead be put to use for demanding and virtuous leisure activities. Bennett, being an early twentieth-century British snob, suggests activities that center on reading difficult literature and rigorous self-reflection. In a representative passage, Bennett dismisses novels because they “never demand any appreciable mental application.” A good leisure pursuit, in Bennett’s calculus, should require more “mental strain” to enjoy (he recommends difficult poetry). (p. 175)
Newport approves of the idea that "the value you receive from a pursuit is often proportional to the energy invested." But then he adds,
For our twenty-first-century purposes, we can ignore the specific activities Bennett suggests. (p. 175)
And what, pray tell, is snobbish or unreasonable about literature and poetry?
Newport has a lot to say about the value of craft: of woodworking, or renovating a bathroom, or repairing a motorcycle, or knitting a sweater. He includes musical performances as well. But—and I find this odd for an author—he seems to have little respect for creating books. Would it be a more noble activity if they were typed on an old Remington, or handwritten? He similarly discounts composing music using a computer as less worthwhile than playing a guitar. I don't buy it.
The following story is for our two oldest grandsons, who have a way of picking up and enjoying construction skills.
[Pete's] welding odyssey began in 2005. At the time, he was building a custom home. ... The house was modern so Pete integrated some custom metalwork into his design plan, including a beautiful custom steel railing on the stairs.
The design seemed like a great idea until Pete received a quote from his metal contractor for the work: it was for $15,800, and Pete had budgeted only $4,000. “If this guy is billing out his metalworking time at $75.00 an hour, that’s a sign that I need to finally learn the craft myself,” Pete recalls thinking at the time. “How hard can it be?” In Pete’s hands, the answer turned out to be: not that hard.
Pete bought a grinder, a metal chop saw, a visor, heavy-duty gloves, and a 120-volt wire-feed flux core welder—which, as Pete explains, is by far the easiest welding device to learn. He then picked some simple projects, loaded up some YouTube videos, and got to work. Before long, Pete became a competent welder—not a master craftsman, but skilled enough to save himself tens of thousands of dollars in labor and parts. (As Pete explains it, he can’t craft a “curvaceous supercar,” but he could certainly weld up a “nice Mad-Max-style dune buggy.”) In addition to completing the railing for his custom home project (for much less than the $15,800 he was quoted), Pete went on to build a similar railing for a rooftop patio on a nearby home. He then started creating steel garden gates and unusual plant holders. He built a custom lumber rack for his pickup truck and fabricated a series of structural parts for straightening up old foundations and floors in the historic homes in his neighborhood. As Pete was writing his post on welding, a metal attachment bracket for his garage door opener broke. He easily fixed it. (pp. 194-195)
If you're wondering where to learn skills needed for simple projects ... the answer is easy. Almost every modern-day handyperson I've spoken to recommends the exact same source for quick how-to lessons: YouTube. (pp. 197-198, emphasis mine)
In the middle of a busy workday, or after a particularly trying morning of childcare, it’s tempting to crave the release of having nothing to do—whole blocks of time with no schedule, no expectations, and no activity beyond whatever seems to catch your attention in the moment. These decompression sessions have their place, but their rewards are muted, as they tend to devolve toward low-quality activities like mindless phone swiping and half-hearted binge-watching. ... Investing energy into something hard but worthwhile almost always returns much richer rewards. (p. 212)
Finally, I can't resist his description of a former Kickstarter project called the Light Phone.
Here’s how it works. Let’s say you have a Light Phone, which is an elegant slab of white plastic about the size of two or three stacked credit cards. This phone has a keypad and a small number display. And that’s it. All it can do is receive and make telephone calls—about as far as you can get from a modern smartphone while still technically counting as a communication device.
Assume you’re leaving the house to run some errands, and you want freedom from constant attacks on your attention. You activate your Light Phone through a few taps on your normal smartphone. At this point, any calls to your normal phone number will be forwarded to your Light Phone. If you call someone from it, the call will show up as coming from your normal smartphone number as well. When you’re ready to put the Light Phone away, a few more taps turns off the forwarding. This is not a replacement for your smartphone, but instead an escape hatch that allows you to take long breaks from it. (p. 245).
Despite our areas of disagreement, there's only one really, really annoying section of the book. He spends seven pages on the ideas of someone named Jennifer who "prefers the pronoun 'they/their' to 'she/her'." The ideas are not worth the ensuing confusion between singular and plural. I found myself constantly re-reading, trying to figure out who was being referenced in the text.
But I do recommend reading Digital Minimalism. The concept of solitude deprivation alone would make it worthwhile, and the rest of the book is pretty good, too—especially if you're not a phone-phobic, introverted author.
Now there's a title I'll bet no one else has used.
All the disruption caused by COVID-19 turns out to have a positive side for us prosopagnosics. Suddenly all meetings are taking place via Zoom, GoToMeeting, or some similar vehicle.
Think of the boon this is to those who suffer from face blindness! Suddenly everyone in the room is labelled! There's the person's picture, and right below it his name! The system is only as good as the names people choose to associate with their photos—I bless those who give their full names, rather than just "Jim"—but the chaos has been somewhat tamed. I'm particularly enjoying it in our church "happy hours" where I am finally, albeit very slowly, beginning to associate names, faces, and voices.
(As I was writing this, it occurred to me to wonder why I am not in favor of wearing name tags in church. It's actually a very helpful practice for people like me. I guess I still have scars from a church where the pastor used what he called "motivation by embarrassment" to encourage the wearing of nametags. If Martin Luther thought Satan could be driven away through the use of mocking and scorn, let me just say that it is also an effective method of driving shy and sensitive people far away from your church.)
It's interesting to me how differently people react to this video social interaction. I find it helpful, but my husband finds it frustrating. I experience less chaos than in real life; he—accustomed to strictly regulated business meetings—feels more. I know my daughter feels more comfortable with Zoom meetings than her husband does; I wonder if there is a difference between introverts and extroverts here as well.
I also find that meetings are more manageable if I listen more than I talk. I probably could learn something from that.
Years ago, I read of the experiences of a volunteer who moved to an impoverished country in an effort to make a positive difference in the lives of its suffering people. His initial observations led him to conclude that the community was indolent. They had no ambition, and preferred sitting around and chatting to making any kind of effort to improve their situation.
After sharing their lives for a while, however, he realized that they were not so much lazy as malnourished and exhausted. Living under a blazing tropical sun, with a diet deficient in both quality and quantity, and no access to medical care, it's a wonder they managed as well as they did.
I thought of that story when I re-read "The Luxury of Feeling Good" at The Occasional CEO.
There exists in our modern world the presumption—or maybe better—the luxury of feeling good. Some combination of the right food, enough sleep, exercise, aspirin and flu shots, and access to real medical care when required have been foundational to my decades in the workforce. ... I know there are unfortunate people who suffer without relief, but most of my co-workers through the years have been able to function comfortably on a daily basis thanks to the many blessings of modern life, from coffee to cold packs to dentists to Tylenol, that keep us upright and productive. What makes the luxury of feeling good so special is that we are among the very first generations of humankind to expect each day to be pain-free and generally comfortable.
I'm at the age where I no longer take health for granted. Too many of my friends are dealing with broken bones, replaced joints, arthritis, and even strokes and cancer. I ache more than I'd like, and even getting out of bed reminds me that my muscles and joints don't work as well as they once did.
Did I say I don't take feeling good for granted? Actually, I do. Most of the time I don't even think about it, till suddenly something hurts, and I start moaning and whining. Here's a glimpse of what the high achievers of generations not that far back had to put up with:
[Eli] Whitney entered Yale with forty-two other freshmen and graduated four years later with only thirty-eight living classmates; if my undergrad class had suffered death at the same rate, we would have lost 133 students of 1,400. On break between school terms, Eli himself nearly died of an unspecified disease, what his sister called “Hypo.” A few years later he was struck down with malaria, the effects of which incapacitated him time and again throughout his life. Then, barely recovered, he headed to New Haven, Connecticut, to commence manufacturing and found the town awash in scarlet and yellow fevers so virulent that he could not employ a steady workforce.
Joshua Lawrence Chamberlain [commanded] Union troops at the Second Battle of Petersburg in June 1864. Chamberlain was shot through the right hip and groin, a wound so serious he was given a deathbed promotion and recorded as deceased in Maine newspapers. ... With peace, he served four terms as the Governor of Maine and went on to become president of Bowdoin College. Chamberlain practiced law in New York City. He pursued real estate interests in Florida and railroad interests on the West Coast. At age 70 he volunteered for duty in the Spanish-American War but was rejected. He died at age 85 due to complications from the wound suffered at Petersburg.
Joshua Chamberlain had a full, rich, active, successful career. Nothing seemed to slow him down. But we also know that from the moment of his Petersburg wound in 1864, he was forced to use some kind of primitive catheter and colostomy bag. He underwent six operations to try to correct his wound. He suffered pains, fevers, and infections throughout most of his life. One of my friends at Gettysburg said, "I think Chamberlain had a urinary tract infection for the last fifty years of his life."
Have you ever had a urinary tract infection for a day? Did it make you want to run for governor?
Keep in mind that these are people whose sufferings and accomplishments have been recorded. Let's not forget the everyday men, women, and children who raised crops and reared children, put dinner on the table, endured long journeys, and built cathedrals, all without aspirin, let alone antibiotics.
This means that the last few generations in America have been blessed with enormous advantage. It's not just that many of us get up in the morning and "pursue our passion" instead of having to plow the fields or milk the cows. It's not simply that we can get warm in the winter and stay cool and productive in the summer, or that we have clean water to drink and indoor plumbing. Perhaps our greatest single advantage over prior generations is the ability to work and live comfortably and pain-free.
If you're browsing the toothpaste aisle of your local grocery store, would you do a double-take upon seeing this prominently displayed?
That's what happened to me several years ago when shopping in Switzerland. To this day I smile whenever I see it on a visit to Migros or Coop. It is a prime example of the need for companies to take care when exporting their products to other countries. Perhaps the best-known example is the story (likely apocryphal, but instructive) of selling the Chevy Nova in Spanish-speaking countries: General Motors certainly didn't want prospective buyers to be thinking "no va"—"doesn't go"—with respect to their cars.
If the Swiss company that makes Candida toothpaste exports their product to English-speaking countries, I doubt it is under the same name. The thought of brushing my teeth with something that suggests a vaginal yeast infection does not inspire me to put this in my shopping cart. It is not much better to be reminded of thrush, a candida infection of the mouth.
I don't know what the makers of Candida were thinking when they chose that name, but it turns out that it's not as crazy as it sounds. Although this toothpaste appears to me to be marketed simply as a good dentifrice, there have been studies showing that certain toothpastes are effective in fighting oral candida infections. Here's a study that compared nine brands of herbal and conventional toothpaste (unfortunately, Candida was not among them) and concluded,
All toothpastes studied in our experiments were effective in inhibiting the growth of all C. albicans isolates. The highest anticandidal activity was obtained from toothpaste containing both herbal extracts and sodium fluoride as active ingredients, while the lowest activity was obtained from toothpaste containing sodium monofluorophosphate as an active ingredient.
Now you know. Maybe the Swiss are onto something.
About once a year or so we actually go out to a theater and watch a movie. I knew I wanted to see Unplanned, and did not have any confidence that it would eventually make it to Netflix. So Porter bought tickets online for our local AMC theater, and we made a date of it.
"Date" is an appropriate word, because despite the seriousness of the subject and a couple of horrifying scenes that probably earned it its "R" rating, Unplanned is basically a love story: The unconditional love of parents for a child who has made lifestyle choices in complete opposition to their own deeply-held values; the steadfast love of a man in support of his wife despite his conviction that her chosen career path is an immoral one; the love that leads us to embrace our common humanity in the face of chasmic differences; and the relentless love of God for his hurting world—"unresting, unhasting, and silent as light."
Abby Johnson's desire to make a difference in the world, to support the rights of women, and to help women in crisis situations led her, beginning as a student volunteer at the local Planned Parenthood clinic, to a promising career with that organization. She became one of the youngest-ever clinic directors, and won an Employee of the Year award in 2008.
And then that same heart-felt desire to help women led her to quit. Unplanned is her story.
The story is well told. The movie is beautiful—except of course where it's ugly. I particularly like the fact that it is not a black-and-white, one-dimensional story of a sudden conversion, despite the "what she saw changed everything" subtitle. As much as can be done in a movie less than two hours in length, we see Abby's growth through time and experience. Her change of heart seems more of a tipping point than a crisis, though there are certainly elements of the latter as well. Abby at the end of the movie is more knowledgeable, more experienced, certainly less naïve, and moving in a different direction in more than one area of her life—but still Abby.
The only fault I find is the portrayal of Abby's boss, who is indeed one-dimensional; we never see her human side. It reminds me of what C. S. Lewis said about George MacDonald, that he was rare among authors in being able to portray good much better than evil: "His saints live; his villains are stagey." It's certainly possible that this woman was as nasty as she seems, and as I said, it's a short movie, but I would like to have seen something redeeming about her character.
Do I recommend seeing Unplanned when you have the chance? Absolutely, 100%, a hundred times over. Do I recommend it for our grandchildren? Eventually. They're all under age for the rating at this point, anyway. Maybe the oldest one or two could handle it well, if their parents watch the film first and agree. Anyone younger than that would be traumatized, maybe scarred for life—if they understood it at all. At first I wondered about the R rating, given the horrible things I've seen in PG-13 movies, but I believe the MPAA got it right in this case. Unplanned is a beautiful movie, and an important one, but there's no denying that it's disturbing in a way no child should be asked to handle. Not that so many kids haven't already seen worse. And it's rather bizarre to require parental consent for a child to watch a movie with a few abortion scenes, when that same child could actually have an abortion without it.
It took a long time for me to dip my toes into the DNA testing waters, being both an avid genealogist and a very private person. But just as giving birth changed my relationship to modesty, starting a blog changed my relationship to privacy. I'm still both modest and private, but not in the same way. The biggest obstacle to DNA testing was knowing I was dragging my family along. As recent events have shown, criminal behavior (and other indiscretions) can be found out by DNA through relatives' information available on genealogy websites.
But I discovered long ago that privacy as we knew it is dead. I remember working with a family researcher who was writing a book on one side of our family. At one time, I would have refused to contribute any information, but had since been helped so much in my research by a book on the Wightman Family that I wanted to help others the same way. The Wightman book, incidentally, has information on me and our family that was contributed without my knowledge or consent. At the time I was not happy, but I got over that and now appreciate it. Except for where the data is wrong....
The point, however, is that while such direct contributions help researchers, they're not all that necessary. When one of my family members declined to contribute his family's information to the project I was helping with, the researcher understood his reluctance—but he added, "Let me show you the information I've already obtained from public sources." He already had just about everything he could use. As Illya Kuryakin (er, Dr. Mallard) said on NCIS last night, "The Internet will be the death of us." Or at least of privacy.
In light of all this, Porter and I each decided to submit a sample to AncestryDNA.com, and eagerly awaited the results. Later we uploaded the DNA data to MyHeritage.com, and eventually gave another sample to 23andMe.com—the latter for both the ancestry and the health screening.
This post is not for a detailed analysis of the results, but an overall impression of the value of the DNA testing. First, from the point of view of genealogy.
For us, the Ancestry.com screening was the most useful, for two reasons.
- They have the largest database from which to work, and that is what makes the testing useful—comparing your DNA to that of other populations. For this reason it is also most useful for those of European background, because of the large numbers of that population who have participated. The testing services are working to improve the experience for under-represented populations, but for now the data is not so robust.
- I have uploaded our family tree, with its nearly 15,000 individuals, to Ancestry.com, and that's largely what makes their DNA service helpful for genealogy. This gives context to our DNA matches, and I've already confirmed known relatives while learning of several more. My tree is at the moment private on Ancestry, which means people have to ask me about the information, which is a good way to get to meet them. Someday I will make it, or at least a version of it, public, but the tree itself isn't ready for that exposure yet.
No doubt MyHeritage would be more useful if I put a tree up there as well, but that's on the "Someday/Maybe" list. I only uploaded our data because at the time they gave free access to their resources if you did. So far they've only found us "third-to-fifth cousins"—tons of them—which is not of much use without trees to compare, and most people seem to have no trees or very small ones. Third cousins share a great-great-grandfather, so it requires a significant amount of family history knowledge to make the connection.
23andMe is in the same situation as far as genealogy goes. So far nothing found even as close as second cousin (sharing a great-grandfather).
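The cousin-to-ancestor mapping above generalizes: nth cousins share a common ancestor n + 1 generations back, which works out to a grandparent with n - 1 "greats." A minimal sketch in Python (my own illustration, not anything from the testing services):

```python
# Sketch: nth cousins share a common ancestor (n + 1) generations back,
# i.e. a grandparent with (n - 1) "greats" in front.
def shared_ancestor(cousin_degree: int) -> str:
    """Name the closest common ancestor shared by cousins of a given degree."""
    greats = cousin_degree - 1  # first cousins share a plain grandparent
    return "great-" * greats + "grandparent"

for n in range(1, 4):
    print(n, shared_ancestor(n))
# 1 grandparent
# 2 great-grandparent
# 3 great-great-grandparent
```

This is why a "third-to-fifth cousin" match is so hard to place without good trees: the shared ancestor sits four to six generations back, beyond most people's ready knowledge of their own family history.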
How has this helped my genealogy research? Well, through Ancestry.com I've connected with a few previously unknown cousins, a couple close enough to be useful in sharing information. Even the ones that are more distant have been useful in providing some confirmation of my research. Overall I'm glad I took the plunge, if only for this reason. It also has a lot of potential for more and better information as time goes on. One important caveat: There is a lot of error in online family trees. Even with DNA support, this information is best taken as inspiration for further research, and for mutual sharing of data sources.
Now for what most people want out of DNA testing: heritage and ethnicity information. This is an estimate only, and each company has its own data and algorithm for making its "best guess." Sometime after we had our samples analyzed, Ancestry.com upgraded their system and re-analyzed our data. The results were not terribly different from the first attempt, though probably more accurate.
The analysis from MyHeritage was closer to Ancestry's original analysis. The one from 23andMe differed from both in its details, though it was quite similar overall.
My impression? The DNA analysis is very good as an overall picture, not so good on the details. For example, Porter's great-grandparents came to the United States from Sweden, and it is well known where they lived before emigrating. In fact, when his dad visited Sweden, he was told he looks just like people who live in that area. Thus when his father's AncestryDNA analysis came back showing his largest ethnicity to be Norwegian, we were taken aback. However, the area he's from may be called Sweden, but it's right on the border with Norway. One can definitely say from his DNA that he is of Scandinavian origin, but that he is specifically Swedish comes from genealogy. One must also remember that the smaller percentages are suspect: of the three analyses, 23andMe was the only one that gave me "broadly East Asian and Native American" ancestry, and that was at just 0.1%, so highly doubtful.
Finally, there's the analysis of genetic health data. This comes primarily from 23andMe, though we also paid an extra $10 post facto for Ancestry's "Traits" screening. I've written about the latter experience before. 23andMe analyzes many more traits than Ancestry's small sample, from "Leigh Syndrome, French Canadian Type" carrier status, to estimated risk for late-onset Alzheimer's Disease, to Lactose Intolerance, to Asparagus Odor Detection.
My thoughts? Interesting, but not quite ready for prime time. Where I have independent data it sometimes confirms, sometimes contradicts the DNA reports. Ancestry says I likely have a "unibrow" but 23andMe says the opposite. Both of them say I probably hate cilantro, and I love it. And so on. So I'm taking the rest of what they say with a few grains of salt. I'm sure there's something to it, and that the data will get better with time, but for now it is more entertainment than useful information. Actually, I take that back: Just as DNA ancestry data is useful as a starting point for further research, the discovery of certain traits might be useful for suggesting further, medical, genetic testing.
There's a lot more to DNA analysis for the serious genealogy researcher to investigate, such as sites that will take your data and give you tools to learn much more about which particular genes you and a DNA match share. I'm not there yet; I have too much to do with my regular research to explore that path further. But it, and my data, are there when I'm ready.
Am I glad I decided to "spit in the tube"? Absolutely; I'd do it again and may later go further with it. I'm very grateful to family members who have taken the plunge as well, because that provides a look at the puzzle from more angles. But it's always important not to expect too much. It's never as simple as trading your kilt for lederhosen, as the Ancestry.com ad blithely shows. Plus there's a risk of finding out things you don't want to know—about family or about health. It's a very personal decision and I understand those who are reluctant to take the risk.
March 21 is World Down Syndrome Day.
Temple Grandin wrote:
It is likely that genius is an abnormality. If the genes that cause autism and other disorders such as manic-depression were eliminated, the world might be left to boring conformists with few creative ideas.
Down Syndrome is not genius, at least not in the intellectual sense. If I could wave my hand and eliminate that third copy of the 21st chromosome, I imagine I would do so. But would that be a good thing? The more I hear from families of children with Down Syndrome, the more I wonder if these people have something important to offer the world that shouldn't be thrown away.
Even if eliminating the genetic defect that results in Down Syndrome would be best for all concerned, I know for a fact that eugenics is not the right way to effect a cure.
The population of people with Down Syndrome is diminishing rapidly, not because someone has cured the condition, nor found a way to prevent its occurrence, but simply because more and more babies with Down Syndrome are killed before they have a chance to be born. Prenatal testing to determine the presence of that extra chromosome is widespread, and more and more parents are opting for abortion rather than meet this challenge.
It's not my place, here, to judge another person's response to a difficulty I have never faced. But as a society we need to be aware of exactly what we are doing. There have been other times in our history when we have made deliberate efforts to eradicate the "unfit," and those actions have been rightly condemned by subsequent generations.
I respect doctors, and am grateful for their skills, knowledge, and compassion. But that respect and gratitude are much the same as my feelings about teachers: individually and personally they can be fantastic, but as a bureaucracy (the medical/educational "establishment") I have serious doubts.
In my own life, the medical establishment's attack on my health began at birth. I don't know the details of my hospital birth, but I know the official policies were long on interference and very disrespectful of the natural birth process. What I know for certain was that my mother was discouraged from breastfeeding and told to feed me "formula," which in those days was a mixture of water, evaporated milk, and Karo corn syrup. (You read that right.)
Somehow I survived that abomination of an infant diet, which also included introducing solid foods at a few weeks of age. But it didn't stop there: I grew up right in the middle of the big push to get people to eat margarine instead of butter. My parents followed that recommendation, too—probably quite willingly, because margarine was so much cheaper than butter. I don't blame them for that, but I do blame the medical establishment for pushing margarine as far healthier than butter. Of course they now tell us just the opposite. Several years ago I made the switch back to butter, but not before exposing our own children to far too much margarine in their diets.
When I was young, my family also switched from drinking whole milk (delivered in glass bottles, with the cream risen to the top) to the skimmed variety, again at the recommendation of the doctors. That one stuck with me—to this day I prefer skim milk, and with skim milk we fed our children. But it would probably have been better if I had never lost my taste for milk with its full complement of fat and natural vitamins. Even the doctors no longer recommend skim milk, though they're still pushing less than the full 4% butterfat version.
I've lived long enough to see doctors insist that all babies must sleep on their backs, then that all babies must sleep on their stomachs, then back to their backs, then their sides...with never an apology for giving the "wrong" advice for so many years. I'm glad that my knowledge of official fickleness enabled me to stand firm in my own decision not to let flip-flopping doctors determine how our babies would sleep. At least we got that one right.
Now there are indications that the intense campaign to come between American skin and the light of the sun is causing problems much more severe and widespread than the skin cancer it's supposedly preventing. The push to slather sunscreen on every time we leave the house has resulted in widespread vitamin D deficiency, and a re-emergence of the bone disorder rickets. Moreover, the article "Is Sunscreen the New Margarine?" makes the case that sun exposure is necessary for our cardiovascular health, especially for healthy blood pressure levels. Many doctors are now saying that we need to ease up on the sun-phobia, though it's still controversial.
One of the leaders of this rebellion is a mild-mannered dermatologist at the University of Edinburgh named Richard Weller. For years, Weller swallowed the party line about the destructive nature of the sun’s rays. “I’m not by nature a rebel,” he insisted when I called him up this fall. “I was always the good boy that toed the line at school. This pathway is one which came from following the data rather than a desire to overturn apple carts.”
Weller’s doubts began around 2010, when he was researching nitric oxide, a molecule produced in the body that dilates blood vessels and lowers blood pressure. He discovered a previously unknown biological pathway by which the skin uses sunlight to make nitric oxide.
It was already well established that rates of high blood pressure, heart disease, stroke, and overall mortality all rise the farther you get from the sunny equator, and they all rise in the darker months. Weller put two and two together and had what he calls his “eureka moment”: Could exposing skin to sunlight lower blood pressure?
Sure enough, when he exposed volunteers to the equivalent of 30 minutes of summer sunlight without sunscreen, their nitric oxide levels went up and their blood pressure went down. Because of its connection to heart disease and strokes, blood pressure is the leading cause of premature death and disease in the world, and the reduction was of a magnitude large enough to prevent millions of deaths on a global level.
Other studies have found more benefits of sun exposure.
Pelle Lindqvist, a senior research fellow in obstetrics and gynecology at Sweden’s Karolinska Institute... tracked the sunbathing habits of nearly 30,000 women in Sweden over 20 years. Originally, he was studying blood clots, which he found occurred less frequently in women who spent more time in the sun—and less frequently during the summer. Lindqvist looked at diabetes next. Sure enough, the sun worshippers had much lower rates. Melanoma? True, the sun worshippers had a higher incidence of it—but they were eight times less likely to die from it.
So Lindqvist decided to look at overall mortality rates, and the results were shocking. Over the 20 years of the study, sun avoiders were twice as likely to die as sun worshippers.
On the other hand,
“I don’t argue with their data,” says David Fisher, chair of the dermatology department at Massachusetts General Hospital. “But I do disagree with the implications.” The risks of skin cancer, he believes, far outweigh the benefits of sun exposure. “Somebody might take these conclusions to mean that the skin-cancer risk is worth it to lower all-cause mortality or to get a benefit in blood pressure,” he says. “I strongly disagree with that.” It is not worth it, he says, unless all other options for lowering blood pressure are exhausted. Instead he recommends vitamin D pills and hypertension drugs as safer approaches.
Seriously? Vitamin D supplements have been shown to be ineffective, probably because there's more to the benefits of sun exposure than the vitamin. And can he honestly believe that exposure to the risks of hypertension drugs is better than a little sunshine? I generally take with a grain of salt the blanket pronouncements of some of my more radical friends that the medical industry has no interest in anything they can't make money from. In most of life I'm inclined to attribute bad effects more to ignorance than to evil intent. However, sometimes that optimism is shaken.
Me? I live in Florida. I know the power of the sun, and am grateful for sunscreen when I deem it necessary. All the doctors agree that sunburn is bad. But even in Florida I've always known that some sun exposure is important—another thing I think we got right with our kids.
I still feel guilty about the margarine, though.
Go Wild: Free Your Body and Mind from the Afflictions of Civilization by John J. Ratey and Richard Manning (Little, Brown and Company, 2014)
I've neither the time nor the inclination for a full review of Go Wild, which I borrowed from the library while waiting for them to acquire Spark, another book by John Ratey, which was highly recommended by a friend. Fortunately, the friend said about Go Wild that she found it good but not worth paying for, so I'm still looking forward to Spark. I found Go Wild too annoying to call "good," but I am glad I read it, as there's a reasonable amount of inspiring information in it.
To begin with, the author pushes several wrong buttons for me, from the trivial to the overwhelming. As an example of the former, there's this (emphasis mine):
Even the child's song knows that the leg bone is connected to the thigh bone; we mean to press this idea a lot further to provide some appreciation of the enormous complexity and interconnectedness of the various elements of human life.
I'm sure he's referring to the spiritual, Dem Bones, which is not a child's song, even if it might end up in a collection of songs intended for children. And I know there are different versions, as there always are with songs of the people, but all the versions I've found acknowledge that the thigh bone is connected to the knee and the hip, not the "leg bone" (or "shin bone" as I know it). Yes, it's trivial—but to me it points to carelessness on the part of the author, which doesn't increase my confidence in what he says. (Or maybe I should blame his proofreaders.) There are other occasions where I get the same feeling.
Then there's this, which to me undercuts all his arguments: I'm fine with evolution as a scientific theory of origins and change. I'd go so far as to say it does an excellent job of explaining much of the available data. But I am not okay with evolution personified and deified, which is what happens in this book. All over, everywhere: "Evolution endowed," "evolution created," "evolution designed." Not only is evolution the basis for all the book's arguments, but the language makes evolution seem like a living, sentient, personal entity—though not, the authors are careful to point out, a loving one.
I was late in coming to the appreciation of religion, but I've always loved science. The religion of science horrifies me, however, and with that this book abounds. Add to that a significant dose of Eastern spirituality, and the feeling that the authors have been, perhaps, a little too selective in the studies they choose to believe—well, I wasn't too happy with the book.
It's also hard to take too seriously someone who, although he loves the outdoors and runs ultramarathons, will also drive 45 minutes to find a gym in the middle of nowhere.
That said, it's almost amazing that I found much of value here, but I did.
The authors cover a lot of ground. Here's a brief summary, although it doesn't come close to doing the ideas justice.
- Do what works for you. There is no one-size-fits-all. Take the first step in any of the areas they recommend changing, and you will find yourself gradually taking on more and more.
- Don't eat sugar in any form.
- Eat no wheat, rice, oats, or any other grain, not even in whole-grain form. No high-carb vegetables like sweet potatoes. No manufactured fats, no processed food, no fast food.
- Eat eggs, grass-finished beef, cold-water fish, nuts, simple fresh fruits and vegetables—but no fruit juices.
- Variety is important—as long as you avoid the long list of don'ts.
- Find a form of exercise you like, and do it.
- Exercise that involves a variety of movements, the whole body, and lots of variation is best.
- Exercise is better out in nature.
- Exercise is better with other people.
- Get more sleep. If you live in 21st century America, it's guaranteed you're not getting enough sleep.
- Sleeping in the same room with the rest of your family is more healthful. (And we thought better sleep at the Maggie P. was due to the salt air.)
- Don't make your babies sleep alone.
- Soothing sounds, such as a crackling fire, or trusted adults moving around and talking quietly, lead to more satisfying sleep.
- Sleep doesn't have to happen all at once. Naps are fine. If you find yourself lying awake in the middle of the night, don't fight it, but get up and do something. Go back to bed later.
- The authors clearly admire Eastern spirituality, and thus promote the practice of meditation. But what they are trying to replicate is the relaxed hyper-awareness common among hunter-gatherer peoples, an ability to clear the brain of distractions while being alert—even more alert than otherwise. This turns out to be good for both brain and body health.
- Being out in nature is enormously healthful. Even an indoor potted plant helps.
- We need other people. We need our own "tribe."
- I wish he had dealt with the differences between introverts and extroverts in this section. We all need people, but the way we need each other is very different for the different personality types, and the authors appear to consider only the extrovert point of view.
As usual, this started out as the place to record a few interesting quotations, and ended up being a long review after all, though my summary did peter out at the end. There's a lot to think about here. I steadfastly reject the authors' extremes: for example, when it comes to food I am an omnivore by inclination but even more by principle, and I would no more adopt this no-carb regimen than I would go vegetarian. At the same time, it's good to eat a lot of vegetables, and it's also good to reduce our intake of carbohydrates, at least of the empty variety. I won't become a marathon runner, much less tackle an ultramarathon—but the book's thesis on the importance of movement is not only convincing, but provides inspiration to do things I've known for a long time that I should be doing.
Here are the random quotes:
Cows evolved to eat grass, but mostly we no longer feed them grass; we feed them the corn and soybeans that are the prime products of our industrial agriculture system. The practice of fattening beef in feedlots and the preponderance of factory beef in the fast-food system passes this omega-3 shortage into our bodies. ... [T]his is also why eating red meat itself has gotten a bad rap, with endless strings of studies linking it to heart disease and a variety of other issues. The beef that is the basis of these conclusions is factory beef, and no wonder.
Although I agree with the authors' complaint that the studies were made with the wrong kind of beef, they provide no evidence that beef from grass-fed cows does not have the same bad effects. I suspect that to be the case, but a citation of some evidence would have been nice.
[W]e begin to understand why social sleeping seems to be a nearly universal characteristic of cultures.... While we are sleeping, we continue to monitor our surroundings for cues of safety: relaxed conversation, relaxed movement of others, popping fire. Those cues, subtle sounds signaling safety, tell us we can retreat to our deepest sleep.
Many cultures are, in fact, conscious of all of this and the importance of these arrangements, and no place is the importance more pronounced than in the case of infants. ... All of this helps explain ... an almost universal perplexed response among most other cultures upon hearing of the Western practice of making babies sleep alone. "They think of this as child abuse. They literally do."
A very recent paper correlates an increase in the incidence of autism with receiving Pitocin during delivery. [Neurobiologist Sue Carter] says that Pitocin is routinely administered to delivering mothers in, she estimates, 90 percent of cases, although there are some signs that this practice is waning.
Why does aggression persist beyond reasons for it? Why are we so riven with senseless killing and warfare?
I picked up on that last one just because it highlights the central problem for people who have no sense of the reality of sin, only of its consequences.
The vagus nerve links up all the tools we need to respond to an existential threat, and so the vagal brake is a signal sent through the system for everything to stand down and engage—at ease. ... There is a simple measure of this. It can be read in the tension or lack of tension in facial muscles, heard in voice timbre and edge, and counted in rate of respiration. ... There is such a thing as vagal tone, completely analogous to muscle tone—and the tone shows how clear and distinct a given individual's ability to apply the brake is.
The vagal brake can be driven by breath, a clear connection readable as blips on a chart. You are in control of your breath, to some degree. Thus, this is not simply a point for measuring or sensing arousal; it is a point for controlling arousal and, downstream, the health problems that stem from lack of control.
If you force yourself to smile, the specific spots in the brain that register depression suddenly say your depression is better. ... It turns out that a halfway, forced smile won't do the trick, because it won't light up the neurons of increased happiness in your brain. But if that forced smile goes so far as to engage the little muscles in the corners of your eyes—that is, if you do what socially adept people understand instinctively—these neurons do indeed light up. And the muscles at the corners of your eyes are within the reach of the vagus nerve.
[The breath] exerts control through the alarm system that is the autonomic nervous system. [Researcher Stephen Porges] says he realized a long time ago—because he is a musician, specifically a horn player—that the act of controlling the breath to control the rhythm of music and at the same time engaging the brain to execute the mechanics of music works like a mental therapy. To his mind, it has all the elements of pranayama yoga, a form of yoga that stresses breath control.
The act of controlling the breath has a parallel brain response of calming our instincts for fear and danger. It's easy enough to see this in deliberate practices like yoga, but the same idea applies in many more time-honored practices: choral singing, Gregorian chants, even social music like bluegrass or blues derived from the chants and work songs that African slaves developed to help them tolerate oppression.
Music or evidence of music appeared fifty thousand years ago in that sudden flourish of evidence of cultural evolution that defined humans as humans—and ever since, music has loomed as a cultural universal. All known cultures and people make music. Yet all of this also suggests that we lose something when the crane's leg bone gets replaced by an iPod. We lose the benefits of sitting in a circle of fellow humans and driving the breath and beat that drives the music. [Emphasis mine]
As my friend said, Go Wild is worth reading—but not worth buying. If what I can only describe as bizarre spirituality—bizarre for a book that claims to be scientific—doesn't bother you, and if you can overlook the extreme positions, which are at their worst in the section on food, there are a number of interesting and worthwhile points.
My brother used to tell me that drinking orange juice was no better than drinking Coke, as it was no better than sweetened water.
As a Floridian, I have rankled at that claim ever since.
It was brought to mind recently in a discussion with my nephew, the medical student, in which I heard him say that the recommendation for drinking juice was no more than two or three times a week. I may have heard the details wrong, because I don't see that when I look online for official recommendations, which are a bit more generous. Or it may be the newest medical-school thinking that hasn't yet been set in stone. But the upshot of the discussion was that whole fruits are good for you and should be encouraged, while fruit juice is bad for you, with no real benefits, and should be severely restricted. This opinion piece in the New York Times is an example of the bad rap juice is getting.
The doctors have good intentions, but I wouldn't be surprised if the real impetus behind this negative attitude towards juice comes from those who want to push soda consumption. After all, if orange juice isn't any better than Coke, why not drink Coke for breakfast, as the granddaughter of an acquaintance used to do?
The real question is: Why is juice so radically different from the whole fruit from which it is (supposedly) made, that the recommendations for consumption are polar opposites?
My answer is that what is called juice these days may have started as fruit, but has been so processed—strained, filtered, heated, added to and subtracted from, torn apart and put (somewhat) back together—that its source is no longer recognizable. Consider the following products:
1. Oranges, freshly picked from the tree, and reamed to extract the juice and much of the flesh
2. Fresh orange juice that has not been pasteurized (I can buy this at local specialty stores, and also at Costco!)
3. "Not from Concentrate" orange juice from the grocery store, which has been processed and pasteurized but at least looks like orange juice because it includes pulp
4. #3 but without any pulp
5. #3 or #4 with calcium added
6. Orange juice from concentrate (John McPhee's book, Oranges, has a graphic description of what happens in that process)
7. Orange juice drink, orange drink, orange-flavored drink, and other designations of something that may or may not have some real orange juice in it
8. Tang and other pseudo-orange beverage mixes
The legal definitions are fuzzy—it's amazing what you can do to a product and still call it "orange juice"—and doctors rightly draw a line between #6 and #7, but say "orange juice" to the general public, and you could evoke thoughts of any of the above.
As far as I'm concerned, the list is in decreasing order of flavor. I suspect it is also in decreasing order of nutrition. But this definition of "juice" is so broad, even if you exclude #7 and #8, that it's useless. What do the doctors mean when they say "fruit is good, juice is bad"? Are they even considering how slippery the definition is?
This is orange juice.
It is juice I squeezed from oranges Porter picked from our own Page orange tree. Technically, the above statement is incorrect, because the Page orange is not a true orange, but a hybrid developed in Orlando in the 1940s that is 3/4 tangerine and 1/4 grapefruit. I should have said, This is citrus juice. I have no idea what the Food and Drug Administration would call it. I call it delicious.
Drinking this juice is not the same thing as eating the fruit, I'll grant. Some of the membranes are left behind in the juicing process. But a lot gets through, as you can see in this picture of the juice before I shook the bottle.
I'd say the experience is pretty close to eating the fruit. I acknowledge that the experience of drinking processed, grocery-store juice is radically different from that of eating fruit. However, the problem is not in the juice. The problem is in the processing, and the labeling.
Don't fight to eliminate juice. Fight to bring back real food!
The infamous Blue Screen of Death is all too familiar to my generation of Windows users. It may be that blue screens are now causing death in a different way.
This Popular Science article reports that prolonged exposure to blue light can cause irreversible damage to the cells that allow us to see. (And truly, I thought of the Blue Screen of Death analogy before I noticed that the article's author did, too.) That would be light from our televisions, computers, phones, e-readers, and even increasingly popular LED illumination.
Catastrophic damage to your vision is hardly guaranteed. But the experiment shows that blue light can kill photoreceptor cells. Murdering enough of them can lead to macular degeneration, an incurable disease that blurs or even eliminates vision.
Blue light occurs naturally in sunlight, which also contains other forms of visible light and ultraviolet and infrared rays. But ... we don’t spend that much time staring at the sun. As kids, most of us were taught it would fry our eyes. Digital devices, however, pose a bigger threat. The average American spends almost 11 hours a day in front of some type of screen, according to a 2016 Nielsen poll. Right now, reading this, you’re probably mainlining blue light.
Obviously, more research is needed before we panic about this. But maybe it's time I stopped putting myself to sleep by reading on my Kindle, or playing a move or two in Word Chums, or praying through our church's Prayer Chain list. They say you should turn off "devices" an hour before bedtime, because the blue light can keep you from falling asleep. That's never been an issue for me. But damaging my eyes? That's a much bigger issue.
So, a handful of people have gotten sick recently from eating salmonella-contaminated eggs from a farm in North Carolina. Salmonella, of course, can be a serious infection and is certainly not one even a healthy person wants to encounter. But who is writing the advice we are being given on how to handle these eggs should we be unfortunate enough to find them in our refrigerator?
Do not eat, serve, or sell these eggs; throw them away or return them for a refund, and be sure to disinfect the shelf on which they were stored.
Really? That kind of overreaction can only have been designed by hyper-sensitive doctors under the advice of their lawyers and malpractice-insurance companies. Why not just hard-boil the eggs? If you cook them until the white and yolk are both hard, you've killed the salmonella bacteria. Maybe I'd give them a couple of extra minutes, just because I can be a little paranoid that way.
And unless you're crazy enough to take your eggs out of the handy carton they come in and store them directly on your refrigerator shelf, I can't imagine why a shelf would need to be especially sanitized.
But hey, what do I know? I'm not a doctor, a biologist, a lawyer, an insurance company executive, or even a helicopter grandparent, so don't take this as advice.
Take it as yet another sign that common sense has been thrown out the window, and scare tactics rule the day—making us more and more inclined to miss the signal of an important warning amidst the noise of constant overreaction. Aesop warned over 2500 years ago of the dangers of crying "wolf."
Warning: sex stereotyping ahead. It's supposed to be funny, folks; don't take it too seriously.
How can you tell that men, not women, designed the birth control pill? Simple. I figured it out after reading Malcolm Gladwell's What the Dog Saw, in which he comments that it is not biologically necessary that birth control pills have an "off" week to induce menstruation; it was part of the design so that the woman's cycle would be more normal. But what is "normal" about menstruating every month? Young girls don't, older women don't, some top athletes don't, and more importantly, women who are pregnant or intensely breastfeeding usually don't, either. Here's the scenario as I see it:
Male researchers: Let's see. Women who are pregnant don't ovulate, so if we manipulate a woman's hormones so that we mimic pregnancy, she won't ovulate, and can't get pregnant. This means we could have sex whenever we feel like it, without any sacrifice on our part, leaving the entire responsibility on women for whether or not they get pregnant. Yee-haw! But we won't really mimic pregnancy, in which a woman doesn't menstruate for at least nine months and sometimes two years or more, because, well, because it's natural for a woman to menstruate every 28 days.
Female researchers: Let's see. Women don't menstruate while pregnant, and often don't while lactating, so if we manipulate a woman's hormones so that we mimic pregnancy, she need only menstruate once every year or two. Yee-haw! This means we could go two years without experiencing the mood swings, intense pain, and mess? Bring it on! Wait, you say we ought to design this pill so that the fake pregnancy miscarries every 28 days? You must be C-R-A-Z-Y!
It was an irresistible headline: Nutritionist claims pizza can be a healthier breakfast than cereal.
I love breakfast. I could eat it for breakfast, lunch, and dinner. My current favorite morning meal is a large bowl of steaming oatmeal with dried fruit, though that may change with the weather.
Make that second-favorite. Pizza is always at the top of the list.
Blogger and dietitian Chelsey Amer caused a stir when [she announced] that a greasy slice of pizza is healthier than a bowl of cereal with milk. "You may be surprised to find out that an average slice of pizza and a bowl of cereal with whole milk contain nearly the same amount of calories,” Amer said. “However, pizza packs a much larger protein punch, which will keep you full and boost satiety throughout the morning."
Not that this is news to me, though it's nice to hear a nutritionist say it. The writer of the article, however, is less than enthusiastic, and spends most of his effort convincing us of ways to make cereal healthier.
New York-based dietitian Keri Gans says that cereal can be a perfectly healthy breakfast option — yes, healthier than pizza — as long as you’re smart about it. ... "If you choose the right cereal that’s packed with fiber, it may help lower cholesterol and control blood sugar. ... You could top your cereal with berries, which are rich in vitamins. ... you [can] work plenty of nutrition into your bowl — far more than you’d find on a dollar slice."
Well, sure, if you want to load the equation in favor of cereal. But you can do the same thing for the pizza. Skip the fast food version. Homemade pizza, whole-grain crust, good tomato sauce and cheese, lots of veggies.... But don't forget the pepperoni, if—like me—you consider it nearly essential to good pizza. Don't skimp on flavor, or it won't be satisfying and you'll eat more.
People tell me they couldn't move to Florida because they can't stand our bugs. Me, I'll take our giant cockroaches any day over ticks.
I grew up in Upstate New York. I spent much of my free time in the woods near our house, and hiked with my father all over the Adirondack Mountains. Never in my life did I see a tick of any sort until a visit to Connecticut after I graduated from college. Now, apparently, ticks are everywhere in the Northeast (and beyond). The worst a roach ever did to me was to scuttle into my bra when I was prone on the floor searching under the kitchen cupboards. The worst a tick has done to me was to give my little grandson Lyme disease, a far more serious, and much less amusing, situation.
Ticks freak me out. I don't know where this infestation came from, and I'm not happy about it.
But just when I started thinking that "extinction is forever" would be a great idea for all tick species, I read this: Oxford University researchers say ticks are a "gold mine" for new drugs.
It's possible that the extinction of any species, even the most apparently useless, annoying, or even dangerous, deprives us of some great, as yet undiscovered, benefit.