I didn't expect to like this Wall Street Journal article about the board game Risk. Unlike nearly all the rest of my extended family, I am not a fan of most board games, especially if they involve intricate strategy and take a long time to complete. It's even worse if I'm playing with people who care whether they win or lose. If I ever played Risk, it wasn't more than once.
But I enjoyed the article, and I understood most of it because of having been surrounded by so many people who love to play the game. The author makes a good case that playing the game taught many of us "everything we know about geography and politics."
A certain kind of brainy kid will reach adulthood with a few general rules for foreign policy: Don’t mass your troops in Asia, stay out of New Guinea, never base an empire in Ukraine. It is the wisdom of Metternich condensed to a few phrases and taught by the game Risk.
The game could be played with up to six players, each representing their own would-be empire, and could last hours. The competition could turn ugly, stressing friendships, but we all came away with the same few lessons. ... In the end, no matter who you call an ally, there can only be one winner, meaning that every partnership is one of convenience. If you are not betraying someone, you are being betrayed. Also: No matter what the numbers suggest, you never know what will happen when the dice are rolled. ... Regardless of technological advances, America will always be protected by its oceans. It is a hard place to invade. What they say about avoiding a land war in Asia is true. It is too big and desolate to control. Ukraine is a riddle ... stupid to invade and tough to subdue because it can be attacked from so many directions, making it seem, to the player of Risk, like nothing but border.
Here's my favorite:
The best players ask themselves what they really want, which means seeing beyond the board. I learned this from my father in the course of an epic game that started on a Friday night and was still going when dawn broke on Saturday. His troops surrounded the last of my armies, crowded in Ukraine. I begged for a reprieve.
“What can I give you?” I asked.
He looked at the board, then at me, then said, “Your Snickers bar.”
“My Snickers bar? But that’s not part of the game.”
“Lesson one,” he said, reaching for the dice. “Everything is part of the game.”
And finally, one amazing side note. The man who invented Risk, French filmmaker Albert Lamorisse, also created the award-winning short film The Red Balloon.
Back in the 1970's, I worked at the University of Rochester Medical Center in Rochester, New York. One of my favorite things to do on my lunch break was to wander over to the Neonatal Intensive Care Unit of the associated Strong Memorial Hospital, and watch in admiration as the tiny children fought for their lives. Actually, there were some pretty big infants, too—babies born to diabetic mothers, weighing in at 14 or 15 pounds at birth, but with dangerous complications. My favorites were always the twins, which were commonly born early, and extra small. Not every family had a happy ending, but the best days were when our small "charges" disappeared from view because they had graduated out of the NICU.
I was thinking about this recently because of this story, out of Canada: Doctor Said Mom's Efforts to Save Her Babies Were a "Waste of Time," Now they're 3 and Thriving.
A mom from Canada who went into labor with twins at just shy of 22 weeks gestation was told by her doctor that they would die the day they were born. However, she refused to give up on her babies, and against the odds, her baby girls pulled through, heading home after 115 days in the NICU.
“When I went into labor, the doctor told me, 'The twins will be born today and they will die,'" she said. "I said, 'Excuse me?' and she said, 'Babies this gestation simply do not survive. It’s impossible.' ... She told me she wouldn’t let me see the twins, or hear their heartbeats, because it was a 'waste of time.'"
After four painful days of abysmal care at the unnamed Canadian hospital,
A new doctor entered the room and informed the couple that they could transfer to a London, Ontario, hospital to deliver the twins. ... Luna and Ema were born in London at 9:12 and 9:29 p.m., respectively. Luna weighed just over 14 ounces (approx. 0.39 kg) and measured 11 inches long; Ema weighed 1 pound (0.45 kg) and measured 12 inches long.
The twins were in the NICU for a total of 115 days and were discharged even before their due date. ... Today, the twins are thriving at 3 years old [and] are developmentally caught up to their full-term peers.
Forty years ago, the staff at "our" NICU had told us that they had saved babies born as early as 20 weeks and weighing less than a pound, and expected to continue to improve outcomes and push the boundaries back. Forty years! I know there has been a lot of progress in the care of preterm babies since then, mostly through the story of friends-of-friends quintuplets born ten years ago in Dallas.
So how is it that doctors and hospitals are condemning little ones like this to death, and considering 22 weeks' gestation a minimum for survival (and even then only at a few specialized hospitals)? What has hindered the progress Strong Hospital's doctors had so eagerly anticipated?
I can think of a few roadblocks. Number one, perhaps, is that we like to think that progress is inevitable. But there's no little hubris in that. Progress is not guaranteed over time, nor is it consistent.
Then there are funding priorities. Adequate financing may not be a sufficient condition for making progress, but it's a necessary one. Has improvement in preterm baby care been a funding priority over the last 40 years?
And of course there's the most difficult problem of all. Do we, as a society, as a country, as the medical profession in general—do we really want to save these babies? They cost a lot of money: for research, for facilities, for high-tech care, for months in the hospital, and often for special education and care throughout their lives, since babies on the leading edge of the survival curve are at higher risk for lifelong difficulties.
Most of all, does the idea of saving the lives of earlier and earlier preterm babies force us to consider the elephant in the room? How long can a society endure in which we try desperately to save the life of one child of a certain age, while casually snuffing out the life of another child of the same age, based solely on personal choice?
Having overheard someone questioning why Coventry Carol was included in our church's Lessons and Carols service earlier this month, I knew it was time to reprise our story of why this song of immeasurable grief belongs in this season of festive joy.
Coventry Carol is an ancient song that tells a story almost as old as Christmas. The events take place sometime after the birth of Christ—after the arrival of the Wise Men, from whom King Herod learns of the birth of a potential rival, and decides to do what kings were wont to do to rivals: kill him. Don't know which baby boy is the threat? No problem, just kill them all.
This song is a lament, a lullaby of the mothers of Bethlehem, whose baby boys would be killed in what came to be called the Massacre of the Innocents. (Jesus escaped, Joseph having been warned in a dream to get out of Dodge; the others are considered the first Christian martyrs—people whose association with Jesus led to their deaths.)
Lully, lullay, Thou little tiny Child,
Bye, bye, lully, lullay.
Lullay, thou little tiny Child,
Bye, bye, lully, lullay.
O sisters two, how may we do,
For to preserve this day
This poor youngling for whom we do sing
Bye, bye, lully, lullay.
Herod, the king, in his raging,
Charged he hath this day
His men of might, in his own sight,
All young children to slay.
That woe is me, poor Child for Thee!
And ever mourn and may,
For thy parting neither say nor sing,
Bye, bye, lully, lullay
Why sing such a gloomy song at Christmas?
Several reasons, maybe. Chief among them is that the Christian Christmas is not like the secular Christmas. It is, indeed, "tidings of great joy," but it is complicated, messy, profound, anything but simplistic and lighthearted. It breaks into the midst of a broken world, and even Jesus' escape from death here is only a short reprieve. There's more to Christmas than the joy of new birth, or even "peace on earth, good will to men." We have to tell the whole story.
Twenty years ago, as the world was beginning in earnest to "ring out the tidings of good cheer," our firstborn daughter gave birth to our first grandchild.
Isaac lived two days.
It was in that season of unspeakable grief that the haunting Coventry Carol touched me as none other could. Frankly, I could not handle all the happy songs about a newborn baby boy; with Coventry Carol I felt merged into an ancient and universal grief, the grief that made Christmas necessary.
Until the Day when all is set right, there will be pain and grief that won't go away just because the calendar says it's December. The last few years, especially, have wounded us all and broken not a few. This reminder that the First Christmas was not a facile Peace on Earth and Joy to the World, and that the first Christian martyrs were Jewish children, is for all whose pain threatens to overwhelm them.
Blessed are they that mourn: for they shall be comforted.
Maybe you smoked some pot when you were young. Or know that your parents did. I did not, except second-hand and co-mingled with tobacco smoke, back in the days when our college movie theater—along with nearly everywhere else—put no restrictions on polluting the indoor air. I saw no reason to foul my lungs and risk fouling my brain. Maybe you think you survived your experiences unscathed. Maybe you did—though you will never know.
So maybe you think marijuana is harmless, remembering the fuss and scare-mongering from your youth. Maybe you are thrilled that in many places marijuana has "gone legit." But this is not your father's weed. Perhaps you thought that legalizing marijuana would take it out of the hands of the drug dealers, that it would be purer and safer.
Apparently not.
Truly, the love of money is a root of all sorts of evil. It seems we have not supplanted the illegal drug dealers and dishonest suppliers, but rather supplemented them with equally greedy mega-businesses, and replaced the lone marijuana plant or two growing in someone's apartment with chemical factories producing ultra-high-potency products that can maim and kill.
Here are two links to one family's story, the tragedy that alerted me to the problem.
Mila's Story, on Heather Heying's Natural Selections Substack, and What Happened to Our Daughter; the latter is from the family's Slowdown Farmstead Substack and tells the same story slightly differently, with more details about the drug problem (and lots of references). Be sure to notice how quickly Mila's mind disintegrated after her first encounter with the drug.
It wasn't just the marijuana that killed Mila. Suicide is always a complex event, with more than one contributing factor.
When you read Mila's story, you'll see that there's no shortage of guilty parties: the school drug counsellor to whom Mila went for help against the addiction that she knew was destroying her, whose response was merely to advise her to "moderate her use"; the First Nations reservation that supplied the dangerous drug "pens" to children, against which the Canadian government was apparently powerless; and most of all, the Canadian governments (federal and provincial) whose draconian COVID-19 restrictions left vulnerable high school students with literally nothing to do and no place to go. The Devil had a field day with those idle hands and minds.
We are just beginning to recognize what is certain eventually to be acknowledged as the truth: that the COVID closures, lockdowns, and travel restrictions, along with masking, social distancing, and vaccine mandates, have destroyed more individuals, families, and relationships than the COVID virus ever did.
YouTube is not exactly reliable when it comes to recommending videos for me to watch, but look what showed up in my sidebar tonight:
As most of my readers know, I'm a huge fan of J.R.R. Tolkien's Lord of the Rings books, but not of the movies for a number of reasons. Even though I feel the film story line and characterization are a betrayal of the spirit Tolkien put into his world, I can't deny that there are parts of the movies that are excellent, from the New Zealand setting to the music, and of course I adore this version.
There are a number of people—I certainly am one of them—who strenuously object to being unwilling medical guinea pigs in the matter of the COVID-19 vaccines.
I'm all for medical research; I worked as part of a medical research team, and have been a willing human guinea pig in a few experiments myself. This work, when done carefully, knowledgeably, and ethically, is an essential part of scientific and medical advancement. But the "ethically" part is essential, and I don't think it's ethical to "enroll" masses of people in experiments for which there cannot possibly be adequate knowledge of the risks, and for which the subjects thus cannot possibly give "informed consent." And when there is no documented, adequate control group (indeed, when the experimenters have done their best to make sure there cannot be one), you've lost good science as well as ethics.
You're thinking I'm talking about the COVID-19 vaccines here, and I am—but that's not all. I don't know how many times we've been unknowingly subjected to these unethical experiments, but I do know that it has happened at least two other times in my lifetime.
Aspirin used to be the standard, go-to medication for children, even babies, with fevers or discomfort. I vividly remember the doctor recommending alternating doses of aspirin and acetaminophen when my infant daughter had a stubborn high fever. This was in the early 1980's, and for most people it worked just great. However, there appeared to be a possible correlation between aspirin use in children and young teens, in combination with a viral illness (often chicken pox), and a rare but sometimes fatal condition called Reye Syndrome.

We had many doctors among our coworkers, and had no reason not to believe what they told us at the time: the decision to tell doctors and parents to avoid giving aspirin to children was a deliberate, national experiment. They thought aspirin caused Reye Syndrome in children, but they couldn't prove it, so they hoped that if aspirin use went down dramatically, and so did the incidence of Reye Syndrome, their point would be made. The disorder did, indeed, retreat significantly; whether through causation or mere correlation is still unknown.

The cynic in me insists on pointing out that, whatever the stated reasons for this massive non-laboratory experiment, and whatever good might or might not have come of it, one clear result was that a cheap, readily available, and highly effective drug was largely replaced by one still under patent. The patent for acetaminophen (Tylenol) did not expire until 2007, and Tylenol was still reeling from the 1982 poisoned-Tylenol-capsules scare. Practically overnight, and with timing highly favorable to the pharmaceutical industry, Tylenol became the drug of choice for a large segment of the population.
The next example I remember of such a huge, non-controlled experiment happened in the early 1990's, and was not a drug but a parenting practice: the insistence by the medical profession that babies never be allowed to sleep on their stomachs. Sleep position recommendations have flip-flopped several times over the years. The professionals never think it safe to leave that decision up to the babies and their parents; they just keep changing what counts as "the only safe way for a baby to sleep." Personally, I think "whatever helps the baby sleep best" is almost always the right choice. (But I am not a doctor, nor any other medical professional, so make your own choices and don't sue me.)
Early in the 1990's the thought was that back-sleeping might reduce the incidence of Sudden Infant Death Syndrome (SIDS). Indeed, there was a decline after the "Back to Sleep" push went into effect, though once again the experiment was unscientific, with no significant control group. Certainly there were still parents who put their babies to sleep on their stomachs, but if there was any widespread study of them I never heard of it; indeed, the data was necessarily corrupted, because the pressure against stomach-sleeping was so great that few parents talked openly about it. And doctors, even if they were well aware of the advantages of stomach-sleeping, could not risk mentioning them to their patients. I remember vividly the one young mother who, months later, confessed to the pediatrician that her son had always slept on his stomach. The doctor laughed, saying, "Of course I knew that! Look at how advanced he is, and look at the perfect shape of his head!" But stomach-sleeping is still very much a "don't ask, don't tell" situation.
These massive, uncontrolled, and to my mind unethical experiments on the human population are justified in the minds of many because, after all, they "did their job." Deaths from Reye Syndrome, SIDS, and COVID-19 have all fallen, so who cares how we got there?
Well, I care—and so should anyone who believes in the scientific method, the Hippocratic Oath, and open, honest, and ethical research.
Something unusual happened in our water aerobics class.
I had fun. I had fun participating in something resembling a sport.
So what? Well, here's the big deal: I don't think that has happened since elementary school.
I loved physical activity back then. Sports, even. Soccer, kickball, dodgeball, volleyball, gymnastics, trampoline. I even enjoyed the since-much-maligned Presidential Physical Fitness Test. I was one of the best in school at swarming up a rope to the ceiling. After school, the neighborhood kids played active games, usually until dark. I was reasonably strong and fit—most children were, in those days—and loved active play.
What happened? Don't say I got old, or busy, though of course I did both. Don't blame it on phones or computers; this was long before these became part of my life.
Physical activity changed. Sports changed. Most people adapted; I didn't.
Back in my day, soccer wasn't the organized sport it is today for even the youngest. We had goals, we had a ball, we had a few basic rules (e.g. "no hands"), and we had a gaggle of kids roughly organized into two "teams." What we did, what I loved, was to run madly up and down the field, trying to kick the ball into the goal. Except for goalie, there were no assigned positions; it was literally a free-for-all. No one today would deign to call it soccer. But it sure was fun.
Volleyball was similar. Again, we had two teams—their composition always changing—a net, a ball, and a few basic rules. But no assigned positions. Serving, but little setting. Just a madcap "let's hit the ball over the net." And I loved it.
For many other people, the eventual organization of sports, honing of skills, multiplication of rules and tactics, and emphasis on competition made the games more fun. The rest of us, I guess, simply dropped out, to the detriment of both our physical and our mental health.
Which is why I was so excited when our instructor suddenly decided that Thursdays would be play days. She gave us small beach balls, and paddles, and organized us very loosely in games of no recognizable sport, but which—in groups, in pairs, and individually—challenged us to use our muscles in ways we hadn't used in a long time: reaching, jumping, running; increasing our strength, agility, and hand-eye coordination—all those things that sports are good for.
Perhaps best of all, when we played together, we became people to each other, not just a group of individuals gathered for healthful exercise. We looked at each other, we made eye contact, we worked together to make sure everyone was included and benefitting.
I was a kid again.
The following is a Dark Horse clip about the significant increase in myopia in children, as reported in this Atlantic article. Bret and Heather have issues with the article, but confirm the myopia problem and have their own theories about it. And, at the end, about orthodontia. It's 30 minutes long—and there's a section in the middle where they spend maybe too much time on the concept of "heritability"—so if you can stand it, you may want to speed up the playback. But I highly recommend watching the video, particularly to parents who are concerned about their children's eyes and teeth. I guess that would be all parents....
As I've said before, Bret and Heather are not always right, and sometimes dangerously wrong. But they are always interesting, and impressive in their quest for truth and their willingness to follow where it leads them, regardless of the popularity of their opinions.
Since COVID isn't so much of a problem in New York City anymore, Mayor Eric Adams and New York City Health & Hospitals CEO Dr. Mitchell Katz have come up with a new way to terrorize those who must be admitted to a Big Apple hospital. At the moment, it's just three facilities: H+H/Lincoln, Metropolitan, and Woodhull Hospitals, but it's feared the contagion may spread.
If you're unfortunate enough to be admitted to one of those hospitals, keep an eye on your dinner plate.
Culturally diverse plant-based meals are now the primary dinner options for inpatients.
Don't panic, NYC residents and visitors. I'm here to reassure you that this problem is not actually new, and there are ways around it.
Back in the mid-1980's, when we moved to Florida, we were warned that our local hospital was run by Seventh-Day Adventists, and consequently meat was never on the menu. The solution, we were told, was to be sure that your doctor provided you with a prescription for meat. I have no idea if making it a prescription increased the cost of meals fifty-fold, or if any insurance plans covered it. But we were assured that the hospital honored the doctors' orders, and the kitchen staff even did a better-than-usual job of preparing the special meals.
Apparently the same work-around will be honored in New York.
Non-plant-based options continue to be available and are offered in accordance with a patient’s prescribed diet.
Choose your doctor well.
It started innocently enough, with an e-mail from AncestryDNA informing me of an additional trait revealed by my genetic makeup: my inclination to seek out or to avoid risky behavior. I could have predicted the result: I definitely prefer to avoid risk. Except, of course, that if I were as risk-averse as they say I am, I wouldn't be about to write something that could get me cancelled by Facebook.
The trigger was in one of Ancestry's explanatory paragraphs:
The world around you also affects your appetite for risk. Younger people and folks who were assigned male at birth report taking the most risks, which may be influenced by environmental factors like social rewards. Some influences are closer to home, like whether your parents encouraged risk taking. Also, our popular understandings of risk may skew more toward physical and financial risks than emotional ones. It's not only risky to do things like step onto a tightrope blindfolded. It's also risky to be honest about your feelings, admit ignorance, and express disagreement. In other words, it's risky to be yourself.
Really, Ancestry? Folks who were assigned male at birth? You mean men? If there's one place I'd expect to be free from this massacre of language, not to mention of reality, it would be a company that makes its money telling people about their chromosomes. When the attendants at my birth announced, "It's a girl!" they were not assigning my sex, they were revealing it, and AncestryDNA should know that better than anybody. Is there any point in trusting the other things they say about my genetics if they think that whether I was born with XX or XY chromosomes is something that was chosen by the birth attendants? Maybe the doctor determined my skin color, too? And the nurse decided I would be right-handed? Humbug.
When American women began coloring their hair, the object was to appear natural (no purple!). Clairol's popular commercial advertising their product contained the catchphrase, "Only her hairdresser knows for sure." My ophthalmologist amended that to, "... and her eye doctor."
While examining my eyes, he had casually announced, "You're actually a blonde." My hair, at that time, was brown, with a smattering of grey. All natural, I might add. A towhead as a child, I had gradually morphed into a brunette. Or so I thought.
"How do you know that?" I questioned.
"You have a blonde fundus. You can dye your hair and fool most people, but your eyes know the truth."
A brand-new story is on its way from S. D. Smith, creator of the Green Ember series. There's a trailer at JackZulu.com, and I'm sure more information will follow.
In the meantime, two of the Green Ember books are currently free for Kindle, with more to come next week. But really, the regular Kindle prices are so low, it's not worth stressing if you miss the sales.
I've been writing these essays for more than 20 years. As with all writers (and other artists), I often look back on my work and shudder. Sometimes, however, I'm okay with what I've written. But how often does someone see a blog post from 2010? Current events may not be relevant anymore and can reasonably be forgotten, and most people don't care about our everyday lives. But I've also said a lot that I think bears repeating; book reviews, for example, are almost always just as useful now as they were then. So I'm going to start to bring back some of my favorites, not only because I believe they'll be useful to what is mostly a whole new audience, but also because I need to be reminded of the content myself.
I'll begin with the fascinating, and important, idea of neuroplasticity, which I first wrote about on May 18, 2010.
The Brain that Changes Itself: Stories of Personal Triumph from the Frontiers of Brain Science by Norman Doidge (Penguin, New York, 2007)
Neuroplasticity.
The idea that our brains are fixed, hard-wired machines was (and in many cases still is) so deeply entrenched in the scientific establishment that evidence to the contrary was not only suppressed, but often not even seen, because the minds of even respectable scientists could not absorb what they were certain was impossible. Having been familiar since the 1960s with the work of Glenn Doman and the Institutes for the Achievement of Human Potential, I was not surprised by the idea that the human brain is continually changing itself and can recover from injury in astonishing ways. In fact, the only shock was that in a 400-page book on neuroplasticity and the persecution of its early pioneers I found not one mention of Doman's name. But the stories are none the less astonishing for that.
In Chapter 1 we meet a woman whose vestibular system was destroyed by antibiotic side-effects. She is freed by a sensor held on her tongue and a computerized helmet from the severely disabling feeling that she is falling all the time, even when lying flat. That's the stuff of science fiction, but what's most astounding is that the effect lingers for a few minutes after she removes the apparatus the first time, and after several sessions she no longer needs the device.
Chapter 3, "Redesigning the Brain," on the work of Michael Merzenich, including the ground-breaking Fast ForWord learning program, is worth the cost of the book all by itself.
Sensitive readers may want to steer clear of Chapter 4, "Acquiring Tastes and Loves," or risk being left with unwanted, disturbing mental images. But it is a must read for anyone who wants to believe that pornography is harmless, or that our personal, private mental fantasies do not adversely affect the very structure of our brains.
The book is less impressive when it gets away from hard science and into psychotherapy, as the ideas become more speculative, but the stories are still impressive.
Phantom pain, learning disabilities, autism, stroke recovery, obsessions and compulsions, age-related mental decline, and much more: the discovery of neuroplasticity shatters misconceptions and offers hope. The Brain that Changes Itself is an appetizer plate; bring on the main course!
For those who want a sampling of the appetizer itself, I'm including an extensive quotation section. Even so, it doesn't come close to doing justice to the depth and especially the breadth of the book. I've pulled quotes from all over, so understand that they are out of context, and don't expect them to move smoothly from one section to another.
Neuro is for "neuron," the nerve cells in our brains and nervous systems. Plastic is for "changeable, malleable, modifiable." At first many of the scientists didn't dare use the word "neuroplasticity" in their publications, and their peers belittled them for promoting fanciful notions. Yet they persisted, slowly overturning the doctrine of the unchanging brain. They showed that children are not always stuck with the mental abilities they are born with; that the damaged brain can often reorganize itself so that when one part fails, another can often substitute; that if brain cells die, they can at times be replaced; that many "circuits" and even basic reflexes that we think are hardwired are not. One of these scientists even showed that thinking, learning, and acting can turn our genes on or off, thus shaping our brain anatomy and our behavior—surely one of the most extraordinary discoveries of the twentieth century.
In the course of my travels I met a scientist who enabled people who had been blind since birth to begin to see, another who enabled the deaf to hear; I spoke with people who had had strokes decades before and had been declared incurable, who were helped to recover with neuroplastic treatments; I met people whose learning disorders were cured and whose IQs were raised; I saw evidence that it is possible for eighty-year-olds to sharpen their memories to function the way they did when they were fifty-five. I saw people rewire their brains with their thoughts, to cure previously incurable obsessions and traumas. I spoke with Nobel laureates who were hotly debating how we must re-think our model of the brain now that we know it is ever changing. ... The idea that the brain can change its own structure and function through thought and activity is, I believe, the most important alteration in our view of the brain since we first sketched out its basic anatomy and the workings of its basic component, the neuron.
In Chapter 2, "Building Herself a Better Brain," a woman with such a severe imbalance of brain function that she was labelled mentally retarded put her own experiences together with the work of other researchers to design brain exercises that fixed the weaknesses in her own brain ... and went on to develop similar diagnostic procedures and exercises for others.
The irony of this new discovery is that for hundreds of years educators did seem to sense that children's brains had to be built up through exercises of increasing difficulty that strengthened brain functions. Up through the nineteenth and early twentieth centuries a classical education often included rote memorization of long poems in foreign languages, which strengthened the auditory memory (hence thinking in language) and an almost fanatical attention to handwriting, which probably helped strengthen motor capacities and thus not only helped handwriting but added speed and fluency to reading and speaking. Often a great deal of attention was paid to exact elocution and to perfecting the pronunciation of words. Then in the 1960s educators dropped such traditional exercises from the curriculum, because they were too rigid, boring, and "not relevant." But the loss of these drills has been costly; they may have been the only opportunity that many students had to systematically exercise the brain function that gives us fluency and grace with symbols. For the rest of us, their disappearance may have contributed to the general decline of eloquence, which requires memory and a level of auditory brainpower unfamiliar to us now. In the Lincoln-Douglas debates of 1858 the debaters would comfortably speak for an hour or more without notes, in extended memorized paragraphs; today many of the most learned among us, raised in our most elite schools since the 1960s, prefer the omnipresent PowerPoint presentation—the ultimate compensation for a weak premotor cortex.
Here are several (but not enough!) from my favorite chapter, "Redesigning the Brain."
[As] they trained an animal at a skill, not only did its neurons fire faster, but because they were faster their signals were clearer. Faster neurons were more likely to fire in sync with each other—becoming better team players—wiring together more and forming groups of neurons that gave off clearer and more powerful signals. This is a crucial point, because a powerful signal has greater impact on the brain. When we want to remember something we have heard we must hear it clearly, because a memory can be only as clear as its original signal.
Paying close attention is essential to long-term plastic change. ... When the animals performed tasks automatically, without paying attention, they changed their brain maps, but the changes did not last. We often praise "the ability to multitask." While you can learn when you divide your attention, divided attention doesn't lead to abiding change in your brain maps.
Somewhere between 5 and 10 percent of preschool children have a language disability that makes it difficult for them to read, write, or even follow instructions. ... [C]hildren with language disabilities have auditory processing problems with common consonant-vowel combinations that are spoken quickly and are called "the fast parts of speech." The children have trouble hearing them accurately and, as a result, reproducing them accurately. Merzenich believed that these children's auditory cortex neurons were firing too slowly, so they couldn't distinguish between two very similar sounds or be certain, if two sounds occurred close together, which was first and which was second. Often they didn't hear the beginnings of syllables or the sound changes within syllables. Normally neurons, after they have processed a sound, are ready to fire again after about a 30-millisecond rest. Eighty percent of language-impaired children took at least three times that long, so that they lost large amounts of language information. When their neuron-firing patterns were examined, the signals weren't clear. ... Improper hearing led to weaknesses in all the language tasks, so they were weak in vocabulary, comprehension, speech, reading, and writing. Because they spent so much energy decoding words, they tended to use shorter sentences and failed to exercise their memory for longer sentences.
[Five hundred children at 35 sites] were given standardized language tests before and after Fast ForWord training. The study showed that most children's ability to understand language normalized after Fast ForWord. In many cases, their comprehension rose above normal. The average child who took the program moved ahead 1.8 years of language development in six weeks. ... A Stanford group did brain scans of twenty dyslexic children, before and after Fast ForWord. The opening scans showed that the children used different parts of their brains for reading than normal children do. After Fast ForWord new scans showed that their brains had begun to normalize.
Merzenich's team started hearing that Fast ForWord was having a number of spillover effects. Children's handwriting improved. Parents reported that many of the students were starting to show sustained attention and focus. Merzenich thought these surprising benefits were occurring because Fast ForWord led to some general improvements in mental processing.
"You know," [Merzenich] says, "IQ goes up. We used the matrix test, which is a visual-based measurement of IQ—and IQ goes up."
The fact that a visual component of the IQ went up meant that the IQ improvements were not caused simply because Fast ForWord improved the children's ability to read verbal test questions. Their mental processing was being improved in a general way.
This is just a sample of the benefits that made me want to rush right out and buy Fast ForWord, even if it were to cost as much as the insanely expensive Rosetta Stone German software I'm also tempted to buy. From the description, it sounds like something everyone could benefit from for mental tune-ups. Unfortunately, the makers of Fast ForWord are even worse than the Rosetta Stone folks about keeping tight control over their product: as far as I've been able to determine, you can only use it under the direction of a therapist (making it too expensive for ordinary use), and even then you don't own the software but are only licensed to use it for a short period of time. :( It works, though. We know someone for whom it made all the difference in the world, even late in her school career.
Merzenich began wondering about the role of a new environmental risk factor that might affect everyone but have a more damaging effect on genetically predisposed children: the continuous background noise from machines, sometimes called white noise. White noise consists of many frequencies and is very stimulating to the auditory cortex.
"Infants are reared in continuously more noisy environments. There is always a din," he says. White noise is everywhere now, coming from fans in our electronics, air conditioners, heaters, and car engines.
To test this hypothesis, his group exposed rat pups to pulses of white noise throughout their critical period and found that the pups' cortices were devastated.
Psychologically, middle age is often an appealing time because, all else being equal, it can be a relatively placid period compared with what has come before. ... We still regard ourselves as active, but we have a tendency to deceive ourselves into thinking that we are learning as we were before. We rarely engage in tasks in which we must focus our attention as closely as we did when we were younger. Such activities as reading the newspaper, practicing a profession of many years, and speaking our own language are mostly the replay of mastered skills, not learning. By the time we hit our seventies, we may not have systematically engaged the systems in the brain that regulate plasticity for fifty years.
That's why learning a new language in old age is so good for improving and maintaining the memory generally. Because it requires intense focus, studying a new language turns on the control system for plasticity and keeps it in good shape for laying down sharp memories of all kinds. No doubt Fast ForWord is responsible for so many general improvements in thinking, in part because it stimulates the control system for plasticity to keep up its production of acetylcholine and dopamine. Anything that requires highly focused attention will help that system—learning new physical activities that require concentration, solving challenging puzzles, or making a career change that requires that you master new skills and material. Merzenich himself is an advocate of learning a new language in old age. "You will gradually sharpen everything up again and that will be very highly beneficial to you."
The same applies to mobility. Just doing the dances you learned years ago won't help your brain's motor cortex stay in shape. To keep the mind alive requires learning something truly new with intense focus. That is what will allow you to both lay down new memories and have a system that can easily access and preserve the older ones.
This work opens up the possibility of high-speed learning later in life. The nucleus basalis [always on for young children, but in adulthood only with sustained, close attention] could be turned on by an electrode, by microinjections of certain chemicals, or by drugs. It is hard to imagine that people will not ... be drawn to a technology that would make it relatively effortless to master the facts of science, history, or a profession, merely by being exposed to them briefly. ... Such techniques would no doubt be used by high school and university students in their studies and in competitive entrance exams. (Already many students who do not have attention deficit disorder use stimulants to study.) Of course, such aggressive interventions might have unanticipated, adverse effects on the brain—not to mention our ability to discipline ourselves—but they would likely be pioneered in cases of dire medical need, where people are willing to take the risk. Turning on the nucleus basalis might help brain-injured patients, so many of whom cannot relearn the lost functions of reading, writing, speaking, or walking because they can't pay close enough attention.
[Gross motor control is] a function that declines as we age, leading to loss of balance, the tendency to fall, and difficulties with mobility. Aside from the failure of vestibular processing, this decline is caused by the decrease in sensory feedback from our feet. According to Merzenich, shoes, worn for decades, limit the sensory feedback from our feet to our brain. If we went barefoot, our brains would receive many different kinds of input as we went over uneven surfaces. Shoes are a relatively flat platform that spreads out the stimuli, and the surfaces we walk on are increasingly artificial and perfectly flat. This leads us to dedifferentiate the maps for the soles of our feet and limit how touch guides our foot control. Then we may start to use canes, walkers, or crutches or rely on other senses to steady ourselves. By resorting to these compensations instead of exercising our failing brain systems, we hasten their decline.
As we age, we want to look down at our feet while walking down stairs or on slightly challenging terrain, because we're not getting much information from our feet. As Merzenich escorted his mother-in-law down the stairs of the villa, he urged her to stop looking down and start feeling her way, so that she would maintain, and develop, the sensory map for her foot, rather than letting it waste away.
Brain plasticity and psychological disorders:
Each time [people with obsessive-compulsive disorder] try to shift gears, they begin ... growing new circuits and altering the caudate. By refocusing the patient is learning not to get sucked in by the content of an obsession but to work around it. I suggest to my patients that they think of the use-it-or-lose-it principle. Each moment they spend thinking of the symptom ... they deepen the obsessive circuit. By bypassing it, they are on the road to losing it. With obsessions and compulsions, the more you do it, the more you want to do it; the less you do it, the less you want to do it ... [I]t is not what you feel while applying the technique that counts, it is what you do. "The struggle is not to make the feeling go away; the struggle is not to give in to the feeling"—by acting out a compulsion, or thinking about the obsession. This technique won't give immediate relief because lasting neuroplastic change takes time, but it does lay the groundwork for change by exercising the brain in a new way. ... The goal is to "change the channel" to some new activity for fifteen to thirty minutes when one has an OCD symptom. (If one can't resist that long, any time spent resisting is beneficial, even if it is only for a minute. That resistance, that effort, is what appears to lay down new circuits.)
Mental practice with physical results:
Pascual-Leone taught two groups of people, who had never studied piano, a sequence of notes, showing them which fingers to move and letting them hear the notes as they were played. Then members of one group, the "mental practice" group, sat in front of an electric piano keyboard, two hours a day, for five days, and imagined both playing the sequence and hearing it played. A second "physical practice" group actually played the music two hours a day for five days. Both groups had their brains mapped before the experiment, each day during it, and afterward. Then both groups were asked to play the sequence, and a computer measured the accuracy of their performances.
Pascual-Leone found that both groups learned to play the sequence, and both showed similar brain map changes. Remarkably, mental practice alone produced the same physical changes in the motor system as actually playing the piece. By the end of the fifth day, the changes in motor signals to the muscles were the same in both groups, and the imagining players were as accurate as the actual players were on their third day.
The level of improvement at five days in the mental practice group, however substantial, was not as great as in those who did physical practice. But when the mental practice group finished its mental training and was given a single two-hour physical practice session, its overall performance improved to the level of the physical practice group's performance at five days. Clearly mental practice is an effective way to prepare for learning a physical skill with minimal physical practice.
In an experiment that is as hard to believe as it is simple, Drs. Guang Yue and Kelly Cole showed that imagining one is using one's muscles actually strengthens them. The study looked at two groups, one that did physical exercise and one that imagined doing exercise. ... At the end of the study the subjects who had done physical exercise increased their muscular strength by 30 percent, as one might expect. Those who only imagined doing the exercise, for the same period, increased their muscle strength by 22 percent. The explanation lies in the motor neurons of the brain that "program" movements. During these imaginary contractions, the neurons responsible for stringing together sequences of instructions for movements are activated and strengthened, resulting in increased strength when the muscles are contracted.
Talk about unbelievable.
The Sea Gypsies are nomadic people who live in a cluster of tropical islands in the Burmese archipelago and off the west coast of Thailand. A wandering water tribe, they learn to swim before they learn to walk and live over half their lives in boats on the open sea. ... Their children dive down, often thirty feet beneath the water's surface, and pluck up their food ... and have done so for centuries. By learning to lower their heart rate, they can stay under water twice as long as most swimmers. They do this without any diving equipment.
But what distinguishes these children, for our purposes, is that they can see clearly at these great depths, without goggles. Most human beings cannot see clearly under water because as sunlight passes through water, it is bent ... so that light doesn't land where it should on the retina.
Anna Gislén, a Swedish researcher, studied the Sea Gypsies' ability to read placards under water and found that they were more than twice as skillful as European children. The Gypsies learned to control the shape of their lenses and, more significantly, to control the size of their pupils, constricting them 22 percent. This is a remarkable finding, because human pupils reflexively get larger under water, and pupil adjustment has been thought to be a fixed, innate reflex, controlled by the brain and nervous system.
This ability of the Sea Gypsies to see under water isn't the product of a unique genetic endowment. Gislén has since taught Swedish children to constrict their pupils to see under water.
The fact that cultures differ in perception is not proof that one perceptual act is as good as the next, or that "everything is relative" when it comes to perception. Clearly some contexts call for a more narrow angle of view, and some for more wide-angle, holistic perception. The Sea Gypsies have survived using a combination of their experience of the sea and holistic perception. So attuned are they to the moods of the sea that when the tsunami of December 26, 2004, hit the Indian Ocean, killing hundreds of thousands, they all survived. They saw that the sea had begun to recede in a strange way, and this drawing back was followed by an unusually small wave; they saw dolphins begin to swim for deeper water, while the elephants started stampeding to higher ground, and they heard the cicadas fall silent. ... Long before modern science put this all together, they had either fled the sea to the shore, seeking the highest ground, or gone into very deep waters, where they also survived.
Music makes extraordinary demands on the brain. A pianist performing the eleventh variation of the Sixth Paganini etude by Franz Liszt must play a staggering eighteen hundred notes per minute. Studies by Taub and others of musicians who play stringed instruments have shown that the more these musicians practice, the larger the brain maps for their active left hands become, and the neurons and maps that respond to string timbres increase; in trumpeters the neurons and maps that respond to "brassy" sounds enlarge. Brain imaging shows that musicians have several areas of their brains—the motor cortex and the cerebellum, among others—that differ from those of nonmusicians. Imaging also shows that musicians who begin playing before the age of seven have larger brain areas connecting the two hemispheres.
It is not just "highly cultured" activities that rewire the brain. Brain scans of London taxi drivers show that the more years a cabbie spends navigating London streets, the larger the volume of his hippocampus, that part of the brain that stores spatial representations. Even leisure activities change our brain; meditators and meditation teachers have a thicker insula, a part of the cortex activated by paying close attention.
Here's something completely different, and frightening.
[T]otalitarian regimes seem to have an intuitive awareness that it becomes hard for people to change after a certain age, which is why so much effort is made to indoctrinate the young from an early age. For instance, North Korea, the most thoroughgoing totalitarian regime in existence, places children in school from ages two and a half to four years; they spend almost every waking hour being immersed in a cult of adoration for dictator Kim Jong Il and his father, Kim Il Sung. They can see their parents only on weekends. Practically every story read to them is about the leader. Forty percent of the primary school textbooks are devoted wholly to describing the two Kims. This continues all the way through school. Hatred of the enemy is drilled in with massed practice as well, so that a brain circuit forms linking the perception of "the enemy" with negative emotions automatically. A typical math quiz asks, "Three soldiers from the Korean People's Army killed thirty American soldiers. How many American soldiers were killed by each of them, if they all killed an equal number of enemy soldiers?" Such perceptual emotional networks, once established in an indoctrinated people, do not lead only to mere "differences of opinion" between them and their adversaries, but to plasticity-based anatomical differences, which are much harder to bridge or overcome than ordinary persuasion.
Think the North Koreans are the only ones whose brains are being reprogrammed?
"The Internet is just one of those things that contemporary humans can spend millions of 'practice' events at, that the average human a thousand years ago had absolutely no exposure to. Our brains are massively remodeled by this exposure—but so, too, by reading, by television, by video games, by modern electronics, by contemporary music, by contemporary 'tools,' etc." — Michael Merzenich, 2005
Erica Michael and Marcel Just of Carnegie Mellon University did a brain scan study to test whether the medium is indeed the message. They showed that different brain areas are involved in hearing speech and reading it, and different comprehension centers in hearing words and reading them. As Just put it, "The brain constructs the message ... differently for reading and listening. ... Listening to an audio book leaves a different set of memories than reading does. A newscast heard on the radio is processed differently from the same words read in a newspaper." This finding refutes the conventional theory of comprehension, which argues that a single center in the brain understands words, and it doesn't really matter how ... information enters the brain.
Television, music videos, and video games, all of which use television techniques, unfold at a much faster pace than real life, and they are getting faster, which causes people to develop an increased appetite for high-speed transitions in those media. It is the form of the television medium—cuts, edits, zooms, pans, and sudden noises—that alters the brain, by activating what Pavlov called the "orienting response," which occurs whenever we sense a sudden change in the world around us, especially a sudden movement. We instinctively interrupt whatever we are doing to turn, pay attention, and get our bearings. ... Television triggers this response at a far more rapid rate than we experience it in life, which is why we can't keep our eyes off the TV screen, even in the middle of an intimate conversation, and why people watch TV a lot longer than they intend. Because typical music videos, action sequences, and commercials trigger orienting responses at a rate of one per second, watching them puts us into continuous orienting response with no recovery. No wonder people report feeling drained from watching TV. Yet we acquire a taste for it and find slower changes boring. The cost is that such activities as reading, complex conversation, and listening to lectures become more difficult.
All electronic devices rewire the brain. People who write on a computer are often at a loss when they have to write by hand or dictate, because their brains are not wired to translate thoughts into cursive writing or speech at high speed. When computers crash and people have mini-nervous breakdowns, there is more than a little truth in their cry, "I feel like I've lost my mind!" As we use an electronic medium, our nervous system extends outward, and the medium extends inward.
Ouch!
"Use it or lose it" is a common refrain in The Brain That Changes Itself, whether talking about specific knowledge and abilities, or the capacity for learning and the very plasticity of the brain itself. (There is some hope given, however, that knowledge apparently lost is recoverable, even if its brain "map" has subsequently been taken over for another use.) Do you worry, as I did, that these new discoveries mean that it really is possible to learn too much, that we need to save our brains for that which is most important? That learning German will drive away what little I know of French? Relax; that doesn't need to happen, although I must be sure to keep the French fresh in my mind or it will get shelved.
As the scientist Gerald Edelman has pointed out, the human cortex alone has 30 billion neurons and is capable of making 1 million billion synaptic connections. Edelman writes, "If we considered the number of possible neural circuits, we would be dealing with hyper-astronomical numbers: 10 followed by at least a million zeros. (There are 10 followed by 79 zeros, give or take a few, of particles in the known universe.)" These staggering numbers explain why the human brain can be described as the most complex known object in the universe, and why it is capable of ongoing, massive microstructural change, and capable of performing so many different mental functions and behaviors.
I'm tired of typing. Get the book.
At the recent mall shooting in Greenwood, Indiana, the law enforcement response was well-trained and fast. But what kept this event from being far more tragic was a 22-year-old already on the scene and apparently sufficiently observant, calm, trained, and equipped to stop the carnage almost immediately by taking out the gunman. As Greenwood police chief Jim Ison himself said, "The real hero of the day is the citizen that was lawfully carrying a firearm in that food court and was able to stop the shooter almost as soon as he began."
Once upon a time, 22-year-olds were accustomed to doing the work of adults, managing their own families, farms, and often businesses. As I'm fond of reminding people, the famed Admiral David Farragut took command of a captured British ship in the War of 1812 at the age of 11, and was given his first command of a U.S. Navy ship at 21. With training, experience, opportunity, and higher expectations, our young people can be far more competent at life than we usually give them credit for.
A friend, who has been experiencing the dating scene, offered the following warning. I just fancied it up a bit.
Back in another life, I worked for the University of Rochester Medical Center. However, that was not how I met David H. Smith, the discoverer and developer of the now-common vaccine against Hemophilus influenzae b. That relationship began when my gynecologist suggested that I might want to help Dr. Smith out with his latest research project.
The following quotes are from the URMC article linked above, which recently came to my attention and inspired this post.
After training in pediatrics at Children's Hospital Medical Center in Boston, Dr. Smith served as a captain in the US Army in Japan. While a medical officer, he became the first to link chronic granulomatous disease to a deficiency in white cells. Back at Harvard, he continued his postdoctoral research in molecular genetics and bacteriology and served as chief of Infectious Diseases at Children's Hospital from 1965 to 1976.
Harvard's legendary professor, Charles Janeway, an early researcher on the human immune system, became Smith's role model and mentor. At a time when much research focused on antibiotics, Janeway challenged his young doctors to expand their vision. At the bedside of a child enduring the agony of meningitis, Janeway said, "One of you should try to find a vaccine to prevent this terrible disease." David Smith took up that challenge, and a 15-year quest was begun.
While at Harvard, he continued studying the biology and epidemiology of bacterial drug resistance factors and in 1968 began the search for a vaccine to protect against Hemophilus influenzae b., the cause of bacterial meningitis. Working in close partnership with Dr. Smith was his research colleague Porter W. Anderson, Ph.D.
In 1976, Dr. Smith was called back to the University of Rochester to chair the Department of Pediatrics. ... Dr. Smith and his research team worked flat out on the search for a Hib vaccine. By the early 1980s, the first Hib vaccine had been tested, licensed, and was being produced in a small laboratory within the medical school.
When I entered the scene, in early 1978, Smith and Anderson had a vaccine that worked for older children, but nothing to protect infants and very young children, a critical, dangerous gap. The research project that I joined was working to address the problem by vaccinating women who were hoping to become pregnant, and following their immune responses through testing for antibodies in the mothers' blood during pregnancy, the babies' blood after birth, and also in breast milk.
It worked! I had a proper immune response, as did our child, who gained further protection through my milk.
I don't know what steps led from that study to the eventual development and acceptance of the H flu b vaccine in use today (it's now called Hib), but even though it was not yet publicly available, they—at my request—very kindly provided it to our second child, born in 1982.
Stung by the resistance of any major pharmaceutical company to buy rights to the vaccine, Dr. Smith decided to create his own pharmaceutical firm. In 1983, he resigned his chairmanship and founded Praxis Biologics. ... By 1989, Praxis had the largest number of new vaccines in clinical trials and one of the finest manufacturing facilities in North America. The initial Hib vaccine (1990) was the first vaccine to be licensed in the U.S. in a decade. The second, a conjugate vaccine, was the first to be licensed for universal use with infants since the rubella vaccine for measles and mumps.
("The rubella vaccine for measles and mumps"? Okay, we all know what they mean, but the article could have benefitted from a proofreader.)
The following is from Dr. Smith's obituary in the New York Times:
In the early 1980's, about 20,000 cases of Hib invasive disease in preschool children were reported to the Federal Centers for Disease Control. In about 12,000 of those cases, the children had meningitis, an inflammation of the brain and spinal cord membranes that can be fatal or cause permanent brain damage. In 1997, a few years after the vaccine became available for infants, 258 cases were reported.
It was a privilege to be part of that work.