Now there's a title I'll bet no one else has used.
All the disruption caused by the COVID-19 virus turns out to have a positive side for us prosopagnosiacs. Suddenly all meetings are taking place via Zoom, GoToMeeting, or some similar vehicle.
Think of the boon this is to those who suffer from face blindness! Suddenly everyone in the room is labelled! There's the person's picture, and right below it his name! The system is only as good as the names people choose to associate with their photos—I bless those who give their full names, rather than just "Jim"—but the chaos has been somewhat tamed. I'm particularly enjoying it in our church "happy hours" where I am finally, albeit very slowly, beginning to associate names, faces, and voices.
(As I was writing this, it occurred to me to wonder why I am not in favor of wearing name tags in church. It's actually a very helpful practice for people like me. I guess I still have scars from a church where the pastor used what he called "motivation by embarrassment" to encourage the wearing of name tags. If Martin Luther thought Satan could be driven away through the use of mocking and scorn, let me just say that it is also an effective method of driving shy and sensitive people far away from your church.)
It's interesting to me how differently people react to this video social interaction. I find it helpful, but my husband finds it frustrating. I experience less chaos than in real life; he—accustomed to strictly regulated business meetings—feels more. I know my daughter feels more comfortable with Zoom meetings than her husband does; I wonder if there is a difference between introverts and extroverts here as well?
I also find that meetings are more manageable if I listen more than I talk. I probably could learn something from that.
Years ago, I read of the experiences of a volunteer who moved to an impoverished country in an effort to make a positive difference in the lives of its suffering people. His initial observations led him to conclude that the community was indolent. They had no ambition, and preferred sitting around and chatting to making any kind of effort to improve their situation.
After sharing their lives for a while, however, he realized that they were not so much lazy as malnourished and exhausted. Living under a blazing tropical sun, with a diet deficient in both quality and quantity, and no access to medical care, it's a wonder they managed as well as they did.
I thought of that story when I re-read "The Luxury of Feeling Good" at The Occasional CEO.
There exists in our modern world the presumption—or maybe better—the luxury of feeling good. Some combination of the right food, enough sleep, exercise, aspirin and flu shots, and access to real medical care when required have been foundational to my decades in the workforce. ... I know there are unfortunate people who suffer without relief, but most of my co-workers through the years have been able to function comfortably on a daily basis thanks to the many blessings of modern life, from coffee to cold packs to dentists to Tylenol, that keep us upright and productive. What makes the luxury of feeling good so special is that we are among the very first generations of humankind to expect each day to be pain-free and generally comfortable.
I'm at the age where I no longer take health for granted. Too many of my friends are dealing with broken bones, replaced joints, arthritis, and even strokes and cancer. I ache more than I'd like, and even getting out of bed reminds me that my muscles and joints don't work as well as they once did.
Did I say I don't take feeling good for granted? Actually, I do. Most of the time I don't even think about it, till suddenly something hurts, and I start moaning and whining. Here's a glimpse of what the high achievers of generations not that far back had to put up with:
[Eli] Whitney entered Yale with forty-two other freshmen and graduated four years later with only thirty-eight living classmates; if my undergrad class had suffered death at the same rate, we would have lost 133 students of 1,400. On break between school terms, Eli himself nearly died of an unspecified disease, what his sister called “Hypo.” A few years later he was struck down with malaria, the effects of which incapacitated him time and again throughout his life. Then, barely recovered, he headed to New Haven, Connecticut, to commence manufacturing and found the town awash in scarlet and yellow fevers so virulent that he could not employ a steady workforce.
Joshua Lawrence Chamberlain [commanded] Union troops at the Second Battle of Petersburg in June 1864. Chamberlain was shot through the right hip and groin, a wound so serious he was given a deathbed promotion and recorded as deceased in Maine newspapers. ... With peace, he served four terms as the Governor of Maine and went on to become president of Bowdoin College. Chamberlain practiced law in New York City. He pursued real estate interests in Florida and railroad interests on the West Coast. At age 70 he volunteered for duty in the Spanish-American War but was rejected. He died at age 85 due to complications from the wound suffered at Petersburg.
Joshua Chamberlain had a full, rich, active, successful career. Nothing seemed to slow him down. But we also know that from the moment of his Petersburg wound in 1864, he was forced to use some kind of primitive catheter and colostomy bag. He underwent six operations to try to correct his wound. He suffered pains, fevers, and infections throughout most of his life. One of my friends at Gettysburg said, "I think Chamberlain had a urinary tract infection for the last fifty years of his life."
Have you ever had a urinary tract infection for a day? Did it make you want to run for governor?
Keep in mind that these are people whose sufferings and accomplishments have been recorded. Let's not forget the everyday men, women, and children who raised crops and reared children, put dinner on the table, endured long journeys, and built cathedrals, all without aspirin, let alone antibiotics.
This means that the last few generations in America have been blessed with enormous advantage. It's not just that many of us get up in the morning and "pursue our passion" instead of having to plow the fields or milk the cows. It's not simply that we can get warm in the winter and stay cool and productive in the summer, or that we have clean water to drink and indoor plumbing. Perhaps our greatest single advantage over prior generations is the ability to work and live comfortably and pain-free.
If you were browsing the toothpaste aisle of your local grocery store, would you do a double-take upon seeing this prominently displayed?
That's what happened to me several years ago when shopping in Switzerland. To this day I smile whenever I see it on a visit to Migros or Coop. It is a prime example of the need for companies to take care when exporting their products to other countries. Perhaps the best-known example is selling the Chevy Nova in Spanish-speaking countries: General Motors certainly didn't want prospective buyers to be thinking "doesn't go" with respect to their cars.
If the Swiss company that makes Candida toothpaste exports their product to English-speaking countries, I doubt it is under the same name. The thought of brushing my teeth with something that suggests a vaginal yeast infection does not inspire me to put this in my shopping cart. It is not much better to be reminded of thrush, a candida infection of the mouth.
I don't know what the makers of Candida were thinking when they chose that name, but it turns out that it's not as crazy as it sounds. Although this toothpaste appears to me to be marketed simply as a good dentifrice, there have been studies showing that certain toothpastes are effective in fighting oral candida infections. Here's a study that compared nine brands of herbal and conventional toothpaste (unfortunately, Candida was not among them) and concluded,
All toothpastes studied in our experiments were effective in inhibiting the growth of all C. albicans isolates. The highest anticandidal activity was obtained from toothpaste containing both herbal extracts and sodium fluoride as active ingredients, while the lowest activity was obtained from toothpaste containing sodium monofluorophosphate as an active ingredient.
Now you know. Maybe the Swiss are onto something.
About once a year or so we actually go out to a theater and watch a movie. I knew I wanted to see Unplanned, and did not have any confidence that it would eventually make it to Netflix. So Porter bought tickets online for our local AMC theater, and we made a date of it.
"Date" is an appropriate word, because despite the seriousness of the subject and a couple of horrifying scenes that probably earned it its "R" rating, Unplanned is basically a love story: The unconditional love of parents for a child who has made lifestyle choices in complete opposition to their own deeply-held values; the steadfast love of a man in support of his wife despite his conviction that her chosen career path is an immoral one; the love that leads us to embrace our common humanity in the face of chasmic differences; and the relentless love of God for his hurting world—"unresting, unhasting, and silent as light."
Abby Johnson's desire to make a difference in the world, to support the rights of women, and to help women in crisis situations led her, beginning as a student volunteer at the local Planned Parenthood clinic, to a promising career with that organization. She became one of the youngest-ever clinic directors, and won an Employee of the Year award in 2008.
And then that same heart-felt desire to help women led her to quit. Unplanned is her story.
The story is well told. The movie is beautiful—except of course where it's ugly. I particularly like the fact that it is not a black-and-white, one-dimensional story of a sudden conversion, despite the "what she saw changed everything" subtitle. As much as can be done in a movie less than two hours in length, we see Abby's growth through time and experience. Her change of heart seems more of a tipping point than a crisis, though there are certainly elements of the latter as well. Abby at the end of the movie is more knowledgeable, more experienced, certainly less naïve, and moving in a different direction in more than one area of her life—but still Abby.
The only fault I find is the portrayal of Abby's boss, who is indeed one-dimensional; we never see her human side. It reminds me of what C. S. Lewis said about George MacDonald, that he was rare among authors in being able to portray good much better than evil: "His saints live; his villains are stagey." It's certainly possible that this woman was as nasty as she seems, and as I said, it's a short movie, but I would like to have seen something redeeming about her character.
Do I recommend seeing Unplanned when you have the chance? Absolutely, 100%, a hundred times over. Do I recommend it for our grandchildren? Eventually. They're all under age for the rating at this point, anyway. Maybe the oldest one or two could handle it well, if their parents watch the film first and agree. Anyone younger than that would be traumatized, maybe scarred for life—if they understood it at all. At first I wondered about the R rating, given the horrible things I've seen in PG-13 movies, but I believe the MPAA got it right in this case. Unplanned is a beautiful movie, and an important one, but there's no denying that it's disturbing in a way no child should be asked to handle. Not that so many kids haven't already seen worse. And it's rather bizarre to require parental consent for a child to watch a movie with a few abortion scenes, when that same child could actually have an abortion without it.
It took a long time for me to dip my toes into the DNA testing waters, being both an avid genealogist and a very private person. But just as giving birth changed my relationship to modesty, starting a blog changed my relationship to privacy. I'm still both modest and private, but not in the same way. The biggest obstacle to DNA testing was knowing I was dragging my family along. As recent events have shown, criminal behavior (and other indiscretions) can be found out by DNA through relatives' information available on genealogy websites.
But I discovered long ago that privacy as we knew it is dead. I remember working with a family researcher who was writing a book on one side of our family. At one time, I would have refused to contribute any information, but had since been helped so much in my research by a book on the Wightman Family that I wanted to help others the same way. The Wightman book, incidentally, has information on me and our family that was contributed without my knowledge or consent. At the time I was not happy, but I got over that and now appreciate it. Except for where the data is wrong....
The point, however, is that while such direct contributions help researchers, they're not all that necessary. When one of my family members declined to contribute his family's information to the project I was helping with, the researcher understood his reluctance—but he added, "Let me show you the information I've already obtained from public sources." He already had just about everything he could use. As Illya Kuryakin—excuse me, Dr. Mallard—said on NCIS last night, "The Internet will be the death of us." Or at least of privacy.
In light of all this, Porter and I each decided to submit a sample to AncestryDNA.com, and eagerly awaited the results. Later we uploaded the DNA data to MyHeritage.com, and eventually gave another sample to 23andMe.com—the latter for both the ancestry and the health screening.
This post is not a detailed analysis of the results, but an overall impression of the value of the DNA testing. First, from the point of view of genealogy.
For us, the Ancestry.com screening was the most useful, for two reasons.
- They have the largest database from which to work, and that is what makes the testing useful—comparing your DNA to that of other populations. For this reason it is also most useful for those of European background, because of the large numbers of that population who have participated. The testing services are working to improve the experience for under-represented populations, but for now the data is not so robust.
- I have uploaded our family tree, with its nearly 15,000 individuals, to Ancestry.com, and that's largely what makes their DNA service helpful for genealogy. This gives context to our DNA matches, and I've already confirmed known relatives while learning of several more. My tree is at the moment private on Ancestry, which means people have to ask me about the information, which is a good way to get to meet them. Someday I will make it, or at least a version of it, public, but the tree itself isn't ready for that exposure yet.
No doubt MyHeritage would be more useful if I put a tree up there as well, but that's on the "Someday/Maybe" list. I only uploaded our data because at the time they gave free access to their resources if you did. So far they've only found us "third-to-fifth cousins"—tons of them—which is not of much use without trees to compare, and most people seem to have no trees or very small ones. Third cousins share a great-great-grandfather, so it requires a significant amount of family history knowledge to make the connection.
23andMe is in the same situation as far as genealogy goes. So far nothing found even as close as second cousin (sharing a great-grandfather).
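For fellow genealogists who like to check the arithmetic, the relationship between cousin degree and shared ancestor is mechanical: degree-n cousins share an ancestor n+1 generations back. This little sketch is my own illustration, not anything the testing services provide:

```python
def shared_ancestor(cousin_degree: int) -> str:
    """Name the most recent ancestor that degree-n cousins share.

    First cousins share a grandparent (2 generations back),
    second cousins a great-grandparent (3 back), third cousins
    a great-great-grandparent (4 back), and so on.
    """
    generations_back = cousin_degree + 1
    greats = generations_back - 2  # number of "great-" prefixes
    return "great-" * greats + "grandparent"

for n in range(1, 6):
    print(f"{n}: {shared_ancestor(n)}")
```

A "third-to-fifth cousin" match thus points at a common ancestor four to six generations back, which is exactly why such matches are nearly useless without well-researched trees on both sides.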
How has this helped my genealogy research? Well, through Ancestry.com I've connected with a few previously unknown cousins, a couple close enough to be useful in sharing information. Even the ones that are more distant have been useful in providing some confirmation of my research. Overall I'm glad I took the plunge, if only for this reason. It also has a lot of potential for more and better information as time goes on. One important caveat: There is a lot of error in online family trees. Even with DNA support, this information is best taken as inspiration for further research, and for mutual sharing of data sources.
Now for what most people want out of DNA testing: heritage and ethnicity information. This is an estimate only, and each company has its own data and algorithm for making its "best guess." Sometime after we had our samples analyzed, Ancestry.com upgraded their system and re-analyzed our data. The results were not terribly different from the first attempt, though probably more accurate.
The analysis from MyHeritage was closer to Ancestry's original analysis. The one from 23andMe differed from both, though it was broadly similar.
My impression? The DNA analysis is very good as an overall picture, not so good on the details. For example, Porter's great-grandparents came to the United States from Sweden, and it is well known where they lived before emigrating. In fact, when his dad visited Sweden, he was told he looks just like people who live in that area. Thus when his father's AncestryDNA analysis came back showing his largest ethnicity to be Norwegian, we were taken aback. However, the area he's from may be called Sweden, but it's right on the border with Norway. One can definitely say from his DNA that he is of Scandinavian origin, but that he is specifically Swedish comes from genealogy. One must also remember that the smaller percentages are suspect: of the three analyses, 23andMe was the only one that gave me "broadly East Asian and Native American" ancestry, and that was at just 0.1%, so highly doubtful.
Finally, there's the analysis of genetic health data. This comes primarily from 23andMe, though we also paid an extra $10 after the fact for Ancestry's "Traits" screening. I've written about the latter experience before. 23andMe analyzes many more traits than Ancestry's small sample, from "Leigh Syndrome, French Canadian Type" carrier status, to estimated risk for late-onset Alzheimer's Disease, to Lactose Intolerance, to Asparagus Odor Detection.
My thoughts? Interesting, but not quite ready for prime time. Where I have independent data it sometimes confirms, sometimes contradicts the DNA reports. Ancestry says I likely have a "unibrow" but 23andMe says the opposite. Both of them say I probably hate cilantro, and I love it. And so on. So I'm taking the rest of what they say with a few grains of salt. I'm sure there's something to it, and that the data will get better with time, but for now it is more entertainment than useful information. Actually, I take that back: Just as DNA ancestry data is useful as a starting point for further research, the discovery of certain traits might be useful for suggesting further, medical, genetic testing.
There's a lot more to DNA analysis for the serious genealogy researcher to investigate, such as sites that will take your data and give you tools to learn much more about which particular genes you and a DNA match share. I'm not there yet; I have too much to do with my regular research to explore that path further. But it, and my data, are there when I'm ready.
Am I glad I decided to "spit in the tube"? Absolutely; I'd do it again and may later go further with it. I'm very grateful to family members who have taken the plunge as well, because that provides a look at the puzzle from more angles. But it's always important not to expect too much. It's never as simple as trading your kilt for lederhosen, as the Ancestry.com ad blithely shows. Plus there's a risk of finding out things you don't want to know—about family or about health. It's a very personal decision and I understand those who are reluctant to take the risk.
March 21 is World Down Syndrome Day.
Temple Grandin wrote:
It is likely that genius is an abnormality. If the genes that cause autism and other disorders such as manic-depression were eliminated, the world might be left to boring conformists with few creative ideas.
Down Syndrome is not genius, at least not in the intellectual sense. If I could wave my hand and eliminate that third copy of the 21st chromosome, I imagine I would do so. But would that be a good thing? The more I hear from families of children with Down Syndrome, the more I wonder if these people have something important to offer the world that shouldn't be thrown away.
Even if eliminating the genetic defect that results in Down Syndrome would be best for all concerned, I know for a fact that eugenics is not the right way to effect a cure.
The population of people with Down Syndrome is diminishing rapidly, not because someone has cured the condition, nor found a way to prevent its occurrence, but simply because more and more babies with Down Syndrome are killed before they have a chance to be born. Prenatal testing to determine the presence of that extra chromosome is widespread, and more and more parents are opting for abortion rather than meet this challenge.
It's not my place, here, to judge another person's response to a difficulty I have never faced. But as a society we need to be aware of exactly what we are doing. There have been other times in our history when we have made deliberate efforts to eradicate the "unfit," and those actions have been rightly condemned by subsequent generations.
I respect doctors, and am grateful for their skills, knowledge, and compassion. But that respect and gratitude are much the same as my feelings about teachers: individually and personally they can be fantastic, but as a bureaucracy (the medical/educational "establishment") I have serious doubts.
In my own life, the medical establishment's attack on my health began at birth. I don't know the details of my hospital birth, but I know the official policies were long on interference and very disrespectful of the natural birth process. What I know for certain was that my mother was discouraged from breastfeeding and told to feed me "formula," which in those days was a mixture of water, evaporated milk, and Karo corn syrup. (You read that right.)
Somehow I survived that abomination of an infant diet, which also included introducing solid foods at a few weeks of age. But it didn't stop there: I grew up right in the middle of the big push to get people to eat margarine instead of butter. My parents followed that recommendation, too—probably quite willingly, because margarine was so much cheaper than butter. I don't blame them for that, but I do blame the medical establishment for pushing margarine as far healthier than butter. Of course they now tell us just the opposite. Several years ago I made the switch back to butter, but not before exposing our own children to far too much margarine in their diets.
When I was young, my family also switched from drinking whole milk (delivered in glass bottles, with the cream risen to the top) to the skimmed variety, again at the recommendation of the doctors. That one stuck with me—to this day I prefer skim milk, and with skim milk we fed our children. But it would probably have been better if I had never lost my taste for milk with its full complement of fat and natural vitamins. Even the doctors no longer recommend skim milk, though they're still pushing less than the full 4% butterfat version.
I've lived long enough to see doctors insist that all babies must sleep on their backs, then that all babies must sleep on their stomachs, then back to their backs, then their sides...with never an apology for giving the "wrong" advice for so many years. I'm glad that my knowledge of official fickleness enabled me to stand firm in my own decision not to let flip-flopping doctors determine how our babies would sleep. At least we got that one right.
Now there are indications that the intense campaign to come between American skin and the light of the sun is causing problems much more severe and widespread than the skin cancer it's supposedly preventing. The push to slather sunscreen on every time we leave the house has resulted in widespread vitamin D deficiency, and a re-emergence of the bone disorder, rickets. Moreover, the article, Is Sunscreen the New Margarine?, makes the case that sun exposure is necessary for our cardiovascular health, especially for healthy blood pressure levels. Many doctors are now saying that we need to ease up on the sun-phobia, though it's still controversial.
One of the leaders of this rebellion is a mild-mannered dermatologist at the University of Edinburgh named Richard Weller. For years, Weller swallowed the party line about the destructive nature of the sun’s rays. “I’m not by nature a rebel,” he insisted when I called him up this fall. “I was always the good boy that toed the line at school. This pathway is one which came from following the data rather than a desire to overturn apple carts.”
Weller’s doubts began around 2010, when he was researching nitric oxide, a molecule produced in the body that dilates blood vessels and lowers blood pressure. He discovered a previously unknown biological pathway by which the skin uses sunlight to make nitric oxide.
It was already well established that rates of high blood pressure, heart disease, stroke, and overall mortality all rise the farther you get from the sunny equator, and they all rise in the darker months. Weller put two and two together and had what he calls his “eureka moment”: Could exposing skin to sunlight lower blood pressure?
Sure enough, when he exposed volunteers to the equivalent of 30 minutes of summer sunlight without sunscreen, their nitric oxide levels went up and their blood pressure went down. Because of its connection to heart disease and strokes, blood pressure is the leading cause of premature death and disease in the world, and the reduction was of a magnitude large enough to prevent millions of deaths on a global level.
Other studies have found more benefits of sun exposure.
Pelle Lindqvist, a senior research fellow in obstetrics and gynecology at Sweden’s Karolinska Institute... tracked the sunbathing habits of nearly 30,000 women in Sweden over 20 years. Originally, he was studying blood clots, which he found occurred less frequently in women who spent more time in the sun—and less frequently during the summer. Lindqvist looked at diabetes next. Sure enough, the sun worshippers had much lower rates. Melanoma? True, the sun worshippers had a higher incidence of it—but they were eight times less likely to die from it.
So Lindqvist decided to look at overall mortality rates, and the results were shocking. Over the 20 years of the study, sun avoiders were twice as likely to die as sun worshippers.
On the other hand,
“I don’t argue with their data,” says David Fisher, chair of the dermatology department at Massachusetts General Hospital. “But I do disagree with the implications.” The risks of skin cancer, he believes, far outweigh the benefits of sun exposure. “Somebody might take these conclusions to mean that the skin-cancer risk is worth it to lower all-cause mortality or to get a benefit in blood pressure,” he says. “I strongly disagree with that.” It is not worth it, he says, unless all other options for lowering blood pressure are exhausted. Instead he recommends vitamin D pills and hypertension drugs as safer approaches.
Seriously? Vitamin D supplements have been shown to be ineffective, probably because there's more to the benefits of sun exposure than the vitamin. And can he honestly believe that exposure to the risks of hypertension drugs is better than a little sunshine? I generally take with a grain of salt the blanket pronouncements of some of my more radical friends that the medical industry has no interest in anything they can't make money from. In most of life I'm inclined to attribute bad effects more to ignorance than to evil intent. However, sometimes that optimism is shaken.
Me? I live in Florida. I know the power of the sun, and am grateful for sunscreen when I deem it necessary. All the doctors agree that sunburn is bad. But even in Florida I've always known that some sun exposure is important—another thing I think we got right with our kids.
I still feel guilty about the margarine, though.
Go Wild: Free Your Body and Mind from the Afflictions of Civilization by John J. Ratey and Richard Manning (Little, Brown and Company, 2014)
I've neither the time nor the inclination for a full review of Go Wild, which I borrowed from the library while waiting for them to acquire Spark, another book by John Ratey, which was highly recommended by a friend. Fortunately, the friend said about Go Wild that she found it good but not worth paying for, so I'm still looking forward to Spark. I found Go Wild too annoying to call "good," but I am glad I read it, as there's a reasonable amount of inspiring information in it.
To begin with, the author pushes several wrong buttons for me, from the trivial to the overwhelming. As an example of the former, there's this (emphasis mine):
Even the child's song knows that the leg bone is connected to the thigh bone; we mean to press this idea a lot further to provide some appreciation of the enormous complexity and interconnectedness of the various elements of human life.
I'm sure he's referring to the spiritual, Dem Bones, which is not a child's song, even if it might end up in a collection of songs intended for children. And I know there are different versions, as there always are with songs of the people, but all the versions I've found acknowledge that the thigh bone is connected to the knee and the hip, not the "leg bone" (or "shin bone" as I know it). Yes, it's trivial—but to me it points to carelessness on the part of the author, which doesn't increase my confidence in what he says. (Or maybe I should blame his proofreaders.) There are other occasions where I get the same feeling.
Then there's this, which to me undercuts all his arguments: I'm fine with evolution as a scientific theory of origins and change. I'd go so far as to say it does an excellent job of explaining much of the available data. But I am not okay with evolution personified and deified, which is what happens in this book. All over, everywhere: "Evolution endowed," "evolution created," "evolution designed." Not only is evolution the basis for all the book's arguments, but the language makes evolution seem like a living, sentient, personal entity—though not, the authors are careful to point out, a loving one.
I was late in coming to the appreciation of religion, but I've always loved science. The religion of science horrifies me, however, and with that this book abounds. Add to that a significant dose of Eastern spirituality, and the feeling that the authors have been, perhaps, a little too selective in the studies they choose to believe—well, I wasn't too happy with the book.
It's also hard to take too seriously someone who—although he loves the outdoors and runs ultramarathons—will also drive 45 minutes to find a gym in the middle of nowhere.
That said, it's almost amazing that I found much of value here, but I did.
The authors cover a lot of ground. Here's a brief summary, although it doesn't come close to doing the ideas justice.
- Do what works for you. There is no one-size-fits-all. Take the first step in any of the areas they recommend changing, and you will find yourself gradually taking on more and more.
- Don't eat sugar in any form.
- Eat no wheat, rice, oats, or any other grain, not even in whole-grain form. No high-carb vegetables like sweet potatoes. No manufactured fats, no processed food, no fast food.
- Eat eggs, grass-finished beef, cold-water fish, nuts, simple fresh fruits and vegetables—but no fruit juices.
- Variety is important—as long as you avoid the long list of don'ts.
- Find a form of exercise you like, and do it.
- Exercise that involves a variety of movements, the whole body, and lots of variation is best.
- Exercise is better out in nature.
- Exercise is better with other people.
- Get more sleep. If you live in 21st century America, it's guaranteed you're not getting enough sleep.
- Sleeping in the same room with the rest of your family is more healthful. (And we thought better sleep at the Maggie P. was due to the salt air.)
- Don't make your babies sleep alone.
- Soothing sounds, such as a crackling fire, or trusted adults moving around and talking quietly, lead to more satisfying sleep.
- Sleep doesn't have to happen all at once. Naps are fine. If you find yourself lying awake in the middle of the night, don't fight it, but get up and do something. Go back to bed later.
- The authors clearly admire Eastern spirituality, and thus promote the practice of meditation. But what they are trying to replicate is the relaxed hyper-awareness common among hunter-gatherer peoples, an ability to calm the brain of distractions while being alert—even more alert than otherwise. This turns out to be good for both brain and body health.
- Being out in nature is enormously healthful. Even an indoor potted plant helps.
- We need other people. We need our own "tribe."
- I wish they had dealt with the differences between introverts and extroverts in this section. We all need people, but the way we need each other is very different for the different personality types, and the authors appear to consider only the extrovert point of view.
As usual, this started out as the place to record a few interesting quotations, and ended up being a long review after all, though my summary did peter out at the end. There's a lot to think about here. I steadfastly reject the authors' extremes: for example, when it comes to food I am an omnivore by inclination but even more by principle, and I would no more adopt this no-carb regimen than I would go vegetarian. At the same time, it's good to eat a lot of vegetables, and it's also good to reduce our intake of carbohydrates, at least of the empty variety. I won't become a marathon runner, much less tackle an ultramarathon—but the book's thesis on the importance of movement is not only convincing, but provides inspiration to do things I've known for a long time that I should be doing.
Here are the random quotes:
Cows evolved to eat grass, but mostly we no longer feed them grass; we feed them the corn and soybeans that are the prime products of our industrial agriculture system. The practice of fattening beef in feedlots and the preponderance of factory beef in the fast-food system passes this omega-3 shortage into our bodies. ... [T]his is also why eating red meat itself has gotten a bad rap, with endless strings of studies linking it to heart disease and a variety of other issues. The beef that is the basis of these conclusions is factory beef, and no wonder.
Although I agree with the authors' complaint that the studies were made with the wrong kind of beef, they provide no evidence that beef from grass-fed cows does not have the same bad effects. I suspect that to be the case, but a citation of some evidence would have been nice.
[W]e begin to understand why social sleeping seems to be a nearly universal characteristic of cultures.... While we are sleeping, we continue to monitor our surroundings for cues of safety: relaxed conversation, relaxed movement of others, popping fire. Those cues, subtle sounds signaling safety, tell us we can retreat to our deepest sleep.
Many cultures are, in fact, conscious of all of this and the importance of these arrangements, and no place is the importance more pronounced than in the case of infants. ... All of this helps explain ... an almost universal perplexed response among most other cultures upon hearing of the Western practice of making babies sleep alone. "They think of this as child abuse. They literally do."
A very recent paper correlates an increase in the incidence of autism with receiving Pitocin during delivery. [Neurobiologist Sue Carter] says that Pitocin is routinely administered to delivering mothers in, she estimates, 90 percent of cases, although there are some signs that this practice is waning.
Why does aggression persist beyond reasons for it? Why are we so riven with senseless killing and warfare?
I picked up on that last one just because it highlights the central problem for people who have no sense of the reality of sin, only of its consequences.
The vagus nerve links up all the tools we need to respond to an existential threat, and so the vagal brake is a signal sent through the system for everything to stand down and engage—at ease. ... There is a simple measure of this. It can be read in the tension or lack of tension in facial muscles, heard in voice timbre and edge, and counted in rate of respiration. ... There is such a thing as vagal tone, completely analogous to muscle tone—and the tone shows how clear and distinct a given individual's ability to apply the brake is.
The vagal brake can be driven by breath, a clear connection readable as blips on a chart. You are in control of your breath, to some degree. Thus, this is not simply a point for measuring or sensing arousal; it is a point for controlling arousal and, downstream, the health problems that stem from lack of control.
If you force yourself to smile, the specific spots in the brain that register depression suddenly say your depression is better. ... It turns out that a halfway, forced smile won't do the trick, because it won't light up the neurons of increased happiness in your brain. But if that forced smile goes so far as to engage the little muscles in the corners of your eyes—that is, if you do what socially adept people understand instinctively—these neurons do indeed light up. And the muscles at the corners of your eyes are within the reach of the vagus nerve.
[The breath] exerts control through the alarm system that is the autonomic nervous system. [Researcher Stephen Porges] says he realized a long time ago—because he is a musician, specifically a horn player—that the act of controlling the breath to control the rhythm of music and at the same time engaging the brain to execute the mechanics of music works like a mental therapy. To his mind, it has all the elements of pranayama yoga, a form of yoga that stresses breath control.
The act of controlling the breath has a parallel brain response of calming our instincts for fear and danger. It's easy enough to see this in deliberate practices like yoga, but the same idea applies in many more time-honored practices: choral singing, Gregorian chants, even social music like bluegrass or blues derived from the chants and work songs that African slaves developed to help them tolerate oppression.
Music or evidence of music appeared fifty thousand years ago in that sudden flourish of evidence of cultural evolution that defined humans as humans—and ever since, music has loomed as a cultural universal. All known cultures and people make music. Yet all of this also suggests that we lose something when the crane's leg bone gets replaced by an iPod. We lose the benefits of sitting in a circle of fellow humans and driving the breath and beat that drives the music. [Emphasis mine]
As my friend said, Go Wild is worth reading—but not worth buying. If what I can only describe as bizarre spirituality—bizarre for a book that claims to be scientific—doesn't bother you, and if you can overlook the extremes, which are at their worst in the section on food, there are a number of interesting and worthwhile points.
My brother used to tell me that drinking orange juice was no better than drinking Coke, since both amounted to little more than sweetened water.
As a Floridian, I have found that claim rankling ever since.
It was brought to mind recently in a discussion with my nephew, the medical student, in which I heard him say that the recommendation for drinking juice was no more than two or three times a week. I may have heard the details wrong, because I don't see that when I look online for official recommendations, which are a bit more generous. Or it may be the newest medical-school thinking that hasn't yet been set in stone. But the upshot of the discussion was that whole fruits are good for you and should be encouraged, while fruit juice is bad for you, with no real benefits, and should be severely restricted. This opinion piece in the New York Times is an example of the bad rap juice is getting.
The doctors have good intentions, but I wouldn't be surprised if the real impetus behind this negative attitude towards juice comes from those who want to push soda consumption. After all, if orange juice isn't any better than Coke, why not drink Coke for breakfast, as the granddaughter of an acquaintance used to do?
The real question is: Why is juice so radically different from the whole fruit from which it is (supposedly) made, that the recommendations for consumption are polar opposites?
My answer is that what is called juice these days may have started as fruit, but has been so processed—strained, filtered, heated, added to and subtracted from, torn apart and put (somewhat) back together—that its source is no longer recognizable. Consider the following products:
1. Oranges, freshly picked from the tree, and reamed to extract the juice and much of the flesh
2. Fresh orange juice that has not been pasteurized (I can buy this at local specialty stores, and also at Costco!)
3. "Not from Concentrate" orange juice from the grocery store, which has been processed and pasteurized but at least looks like orange juice because it includes pulp
4. #3 but without any pulp
5. #3 or #4 with calcium added
6. Orange juice from concentrate (John McPhee's book, Oranges, has a graphic description of what happens in that process)
7. Orange juice drink, orange drink, orange-flavored drink, and other designations of something that may or may not have some real orange juice in it
8. Tang and other pseudo-orange beverage mixes
The legal definitions are fuzzy—it's amazing what you can do to a product and still call it "orange juice"—and doctors rightly draw a line between #6 and #7, but say "orange juice" to the general public, and you could evoke thoughts of any of the above.
As far as I'm concerned, the list is in decreasing order of flavor. I suspect it is also in decreasing order of nutrition. But this definition of "juice" is so broad, even if you exclude #7 and #8, that it's useless. What do the doctors mean when they say "fruit is good, juice is bad"? Are they even considering how slippery the definition is?
This is orange juice.
It is juice I squeezed from oranges Porter picked from our own Page orange tree. Technically, the above statement is incorrect, because the Page orange is not a true orange, but a hybrid developed in Orlando in the 1940s that is 3/4 tangerine and 1/4 grapefruit. I should have said, This is citrus juice. I have no idea what the Food and Drug Administration would call it. I call it delicious.
Drinking this juice is not the same thing as eating the fruit, I'll grant. Some of the membranes are left behind in the juicing process. But a lot gets through, as you can see in this picture of the juice before I shook the bottle.
I'd say the experience is pretty close to eating the fruit. I acknowledge that the experience of drinking processed, grocery-store juice is radically different from that of eating fruit. However, the problem is not in the juice. The problem is in the processing, and the labelling.
Don't fight to eliminate juice. Fight to bring back real food!
The infamous Blue Screen of Death is all too familiar to my generation of Windows users. It may be that blue screens are now causing death in a different way.
This Popular Science article reports that prolonged exposure to blue light can cause irreversible damage to the cells that allow us to see. (And truly, I thought of the Blue Screen of Death analogy before I noticed that the article's author did, too.) That would be light from our televisions, computers, phones, e-readers, and even increasingly popular LED illumination.
Catastrophic damage to your vision is hardly guaranteed. But the experiment shows that blue light can kill photoreceptor cells. Murdering enough of them can lead to macular degeneration, an incurable disease that blurs or even eliminates vision.
Blue light occurs naturally in sunlight, which also contains other forms of visible light and ultraviolet and infrared rays. But ... we don’t spend that much time staring at the sun. As kids, most of us were taught it would fry our eyes. Digital devices, however, pose a bigger threat. The average American spends almost 11 hours a day in front of some type of screen, according to a 2016 Nielsen poll. Right now, reading this, you’re probably mainlining blue light.
Obviously, more research is needed before we panic about this. But maybe it's time I stopped putting myself to sleep by reading on my Kindle, or playing a move or two in Word Chums, or praying through our church's Prayer Chain list. They say you should turn off "devices" an hour before bedtime, because the blue light can keep you from falling asleep. That's never been an issue for me. But damaging my eyes? That's a much bigger issue.
So, a handful of people have gotten sick recently from eating salmonella-contaminated eggs from a farm in North Carolina. Salmonella, of course, can be a serious infection and is certainly not one even a healthy person wants to encounter. But who is writing the advice we are being given on how to handle these eggs should we be unfortunate enough to find them in our refrigerator?
Do not eat, serve, or sell these eggs; throw them away or return them for a refund, and be sure to disinfect the shelf on which they were stored.
Really? That kind of overreaction can only have been designed by hyper-sensitive doctors under the advice of their lawyers and malpractice-insurance companies. Why not just hard-boil the eggs? If you cook them until the white and yolk are both hard, you've killed the salmonella bacteria. Maybe I'd give them a couple of extra minutes, just because I can be a little paranoid that way.
And unless you're crazy enough to take your eggs out of the handy carton they come in and store them directly on your refrigerator shelf, I can't imagine why a shelf would need to be especially sanitized.
But hey, what do I know? I'm not a doctor, a biologist, a lawyer, an insurance company executive, or even a helicopter grandparent, so don't take this as advice.
Take it as yet another sign that common sense has been thrown out the window, and scare tactics rule the day—making us more and more inclined to miss the signal of an important warning amidst the noise of constant overreaction. Aesop warned over 2500 years ago of the dangers of crying "wolf."
Warning: sex stereotyping ahead. It's supposed to be funny, folks; don't take it too seriously.
How can you tell that men, not women, designed the birth control pill? Simple. I figured it out after reading Malcolm Gladwell's What the Dog Saw, in which he comments that it is not biologically necessary that birth control pills have an "off" week to induce menstruation; it was part of the design so that the woman's cycle would be more normal. But what is "normal" about menstruating every month? Young girls don't, older women don't, some top athletes don't, and more importantly, women who are pregnant or intensely breastfeeding usually don't, either. Here's the scenario as I see it:
Male researchers: Let's see. Women who are pregnant don't ovulate, so if we manipulate a woman's hormones so that we mimic pregnancy, she won't ovulate, and can't get pregnant. This means we could have sex whenever we feel like it, without any sacrifice on our part, leaving the entire responsibility on women for whether or not they get pregnant. Yee-haw! But we won't really mimic pregnancy, in which a woman doesn't menstruate for at least nine months and sometimes two years or more, because, well, because it's natural for a woman to menstruate every 28 days.
Female researchers: Let's see. Women don't menstruate while pregnant, and often don't while lactating, so if we manipulate a woman's hormones so that we mimic pregnancy, she need only menstruate once every year or two. Yee-haw! This means we could go two years without experiencing the mood swings, intense pain, and mess? Bring it on! Wait, you say we ought to design this pill so that the fake pregnancy miscarries every 28 days? You must be C-R-A-Z-Y!
It was an irresistible headline: Nutritionist claims pizza can be a healthier breakfast than cereal.
I love breakfast. I could eat it for breakfast, lunch, and dinner. My current favorite morning meal is a large bowl of steaming oatmeal with dried fruit, though that may change with the weather.
Make that second-favorite. Pizza is always at the top of the list.
Blogger and dietitian Chelsey Amer caused a stir when [she announced] that a greasy slice of pizza is healthier than a bowl of cereal with milk. "You may be surprised to find out that an average slice of pizza and a bowl of cereal with whole milk contain nearly the same amount of calories," Amer said. "However, pizza packs a much larger protein punch, which will keep you full and boost satiety throughout the morning."
Not that this is news to me, though it's nice to hear a nutritionist say it. The writer of the article, however, is less than enthusiastic, and spends most of his effort convincing us of ways to make cereal healthier.
New York-based dietitian Keri Gans says that cereal can be a perfectly healthy breakfast option — yes, healthier than pizza — as long as you’re smart about it. ... "If you choose the right cereal that’s packed with fiber, it may help lower cholesterol and control blood sugar. ... You could top your cereal with berries, which are rich in vitamins. ... you [can] work plenty of nutrition into your bowl — far more than you’d find on a dollar slice."
Well, sure, if you want to load the equation in favor of cereal. But you can do the same thing for the pizza. Skip the fast food version. Homemade pizza, whole-grain crust, good tomato sauce and cheese, lots of veggies.... But don't forget the pepperoni, if—like me—you consider it nearly essential to good pizza. Don't skimp on flavor, or it won't be satisfying and you'll eat more.
People tell me they couldn't move to Florida because they can't stand our bugs. Me, I'll take our giant cockroaches any day over ticks.
I grew up in Upstate New York. I spent much of my free time in the woods near our house, and hiked with my father all over the Adirondack Mountains. Never in my life did I see a tick of any sort until a visit to Connecticut after I graduated from college. Now, apparently, ticks are everywhere in the Northeast (and more). The worst a roach ever did to me was to scuttle into my bra when I was prone on the floor searching under the kitchen cupboards. The worst a tick has done to me was to give my little grandson Lyme disease, a far more serious, and much less amusing, situation.
Ticks freak me out. I don't know where this infestation came from, and I'm not happy about it.
But just when I started thinking that "extinction is forever" would be a great idea for all tick species, I read this: Oxford University researchers say ticks are a "gold mine" for new drugs.
It's possible that the extinction of any species, even the most apparently useless, annoying, or even dangerous, deprives us of some great, as yet undiscovered, benefit.
Brain on Fire: My Month of Madness by Susannah Cahalan (Simon & Schuster, 2012)
I enjoy reading medical stories, but they carry a risk: it's all too easy for me to look over my shoulder and imagine the patient's symptoms creeping up on me. It's a good thing that anti-NMDA-receptor autoimmune encephalitis is primarily a young person's disease.
This rare and bizarre condition looks for all the world like a severe psychiatric disorder, but occurs when something provokes a person's immune system to attack his brain. What, why, and how are still unknown, but it's usually curable, if caught and treated—a very expensive process—in time. Susannah Cahalan was the 217th person to be diagnosed with this disease, and if she had not been in the right place at the right time, she would probably have been committed to a mental hospital for the rest of her shortened life. If she had had his strength, she could easily have played the part of the Gadarene demoniac.
Thanks mostly to being at a great hospital (NYU), and ending up (after several false starts) with just the right doctors, Cahalan made a full recovery. But while anti-NMDA-receptor autoimmune encephalitis and similar brain disorders are now much more likely to be caught than they were in 2009 when Cahalan fell ill, this is still a cautionary tale of the importance of second (or third or fourth) opinions, and of searching for physical causes for abnormal mental conditions. Autism and schizophrenia are just two of the diagnoses that are sometimes erroneously given to patients with these autoimmune disorders. Unfortunately, the specialized tests needed for proper diagnosis are currently too invasive and too expensive to be used routinely.
Brain on Fire is a gripping, well-written, and important book—even if, once again, I found myself regretting the demise of the censor's blue pencil.