YouTube is not exactly reliable when it comes to recommending videos for me to watch, but look what showed up in my sidebar tonight:
As most of my readers know, I'm a huge fan of J.R.R. Tolkien's Lord of the Rings books, but not of the movies, for a number of reasons. Even though I feel the films' storyline and characterization are a betrayal of the spirit Tolkien put into his world, I can't deny that there are parts of the movies that are excellent, from the New Zealand setting to the music, and of course I adore this version.
There are a number of people—I certainly am one of them—who strenuously object to being unwilling medical guinea pigs in the matter of the COVID-19 vaccines.
I'm all for medical research. I worked as part of a medical research team, and have been a willing human guinea pig in a few experiments myself. This work, when done carefully, knowledgeably, and ethically, is an essential part of scientific and medical advancement. But the "ethically" part is essential, and I don't think it's ethical to "enroll" masses of people in experiments for which there cannot possibly be adequate knowledge of the risks, and for which they thus cannot possibly give "informed consent." Plus, when there is no documented, adequate control group—not to mention that the experimenters have done their best to make sure there cannot be an adequate control group—well, then you've lost good science as well as ethics.
You're thinking I'm talking about the COVID-19 vaccines here, and I am—but that's not all. I don't know how many times we've been unknowingly subjected to these unethical experiments, but I do know that it has happened at least two other times in my lifetime.
Aspirin used to be the standard, go-to medication for children, even babies, with fevers or discomfort. I vividly remember the doctor recommending alternating doses of aspirin and acetaminophen when my infant daughter had a stubborn high fever. This was in the early 1980's, and for most people it worked just great. However, there appeared to be a possible correlation between aspirin use in children and young teens, in combination with a viral illness (often chicken pox), and a rare but sometimes fatal condition called Reye Syndrome.

We had many doctors among our coworkers, and had no reason not to believe what they told us at the time: The decision to tell doctors and parents to avoid giving aspirin to children was a deliberate, national experiment. They thought aspirin caused Reye Syndrome in children, but they couldn't prove it, so they hoped that if aspirin use went down dramatically, and so did the incidence of Reye Syndrome, their point would be made. The disorder did, indeed, retreat significantly; whether through causation or mere correlation is still unknown.

The cynic in me insists on pointing out that, whatever the stated reasons for this massive non-laboratory experiment, and whatever good might or might not have come of it, one clear result was that a cheap, readily-available, and highly effective drug was massively replaced by one still under patent. The patent for acetaminophen (Tylenol) did not expire until 2007, and Tylenol was still reeling from the 1982 poisoned-Tylenol-capsules scare. Practically overnight, and with timing highly favorable to the pharmaceutical industry, Tylenol became the drug of choice for a large segment of the population.
The next example I remember of such a huge, non-controlled experiment happened in the early 1990's, and was not a drug but a parenting practice: the insistence by the medical profession that babies never be allowed to sleep on their stomachs. Sleep position recommendations have flip-flopped several times over the years. The professionals never think it safe to leave that decision up to the babies and their parents; they just keep changing what counts as "the only safe way for a baby to sleep." Personally, I think "whatever helps the baby sleep best" is almost always the right choice. (But I am not a doctor, nor any other medical professional, so make your own choices and don't sue me.)
Early in the 1990's the thought was that back-sleeping might reduce the incidence of Sudden Infant Death Syndrome (SIDS). Indeed, there was a decline after the "Back to Sleep" push went into effect, though once again the experiment was unscientific, with no significant control group. Certainly there were still parents who put their babies to sleep on their stomachs, but if there was any widespread study of them I never heard of it, and indeed the data were necessarily corrupted, because the pressure against stomach-sleeping was so great that few parents talked openly about it. And doctors, even if they were well aware of the advantages of stomach-sleeping, could not risk mentioning them to their patients. I vividly remember one young mother who, months later, confessed to the pediatrician that her son had always slept on his stomach. The doctor laughed, saying, "Of course I knew that! Look at how advanced he is, and look at the perfect shape of his head!" But stomach-sleeping is still very much a "don't ask, don't tell" situation.
These massive, uncontrolled, and to my mind unethical experiments on the human population are justified in the minds of many because, after all, they "did their job." Deaths from Reye Syndrome, SIDS, and COVID-19 have all fallen, so who cares how we got there?
Well, I care—and so should anyone who believes in the scientific method, the Hippocratic Oath, and open, honest, and ethical research.
Something unusual happened in our water aerobics class.
I had fun. I had fun participating in something resembling a sport.
So what? Well, here's the big deal: I don't think that has happened since elementary school.
I loved physical activity back then. Sports, even. Soccer, kickball, dodge ball, volleyball, gymnastics, trampoline. I even enjoyed the since-much-maligned President's Physical Fitness Test. I was one of the best in school at swarming up a rope to the ceiling. After school, the neighborhood kids played active games, usually until dark. I was reasonably strong and fit—most children were, in those days—and loved active play.
What happened? Don't say I got old, or busy, though of course I did both. Don't blame it on phones or computers; this was long before these became part of my life.
Physical activity changed. Sports changed. Most people adapted; I didn't.
Back in my day, soccer wasn't the organized sport it is today for even the youngest. We had goals, we had a ball, we had a few basic rules (e.g. "no hands"), and we had a gaggle of kids roughly organized into two "teams." What we did, what I loved, was to run madly up and down the field, trying to kick the ball into the goal. Except for goalie, there were no assigned positions; it was literally a free-for-all. No one today would deign to call it soccer. But it sure was fun.
Volleyball was similar. Again, we had two teams—their composition always changing—a net, a ball, and a few basic rules. But no assigned positions. Serving, but little setting. Just a madcap "let's hit the ball over the net." And I loved it.
For many other people, the eventual organization of sports, honing of skills, multiplication of rules and tactics, and emphasis on competition made the games more fun. The rest of us, I guess, simply dropped out, to the detriment of both our physical and our mental health.
Which is why I was so excited when our instructor suddenly decided that Thursdays would be play days. She gave us small beach balls, and paddles, and organized us very loosely in games of no recognizable sport, but which—in groups, in pairs, and individually—challenged us to use our muscles in ways we hadn't used in a long time: reaching, jumping, running; increasing our strength, agility, and hand-eye coordination—all those things that sports are good for.
Perhaps best of all, when we played together, we became people to each other, not just a group of individuals gathered for healthful exercise. We looked at each other, we made eye contact, we worked together to make sure everyone was included and benefitting.
I was a kid again.
The following is a Dark Horse clip about the significant increase in myopia in children, as reported in this Atlantic article. Bret and Heather have issues with the article, but confirm the myopia problem and have their own theories about it. And, at the end, about orthodontia. It's 30 minutes long—and there's a section in the middle where they spend maybe too much time on the concept of "heritability"—so if you can stand it, you may want to speed up the playback. But I highly recommend watching the video, particularly to parents who are concerned about their children's eyes and teeth. I guess that would be all parents....
As I've said before, Bret and Heather are not always right, and sometimes dangerously wrong. But they are always interesting, and impressive in their quest for truth and their willingness to follow where it leads them, regardless of the popularity of their opinions.
Since COVID isn't so much of a problem in New York City anymore, Mayor Eric Adams and New York City Health & Hospitals CEO Dr. Mitchell Katz have come up with a new way to terrorize those who must be admitted to a Big Apple hospital. At the moment, it's just three facilities: H+H/Lincoln, Metropolitan, and Woodhull Hospitals, but it's feared the contagion may spread.
If you're unfortunate enough to be admitted to one of those hospitals, keep an eye on your dinner plate.
Culturally diverse plant-based meals are now the primary dinner options for inpatients.
Don't panic, NYC residents and visitors. I'm here to reassure you that this problem is not actually new, and there are ways around it.
Back in the mid-1980's, when we moved to Florida, we were warned that our local hospital was run by Seventh-Day Adventists, and consequently meat was never on the menu. The solution, we were told, was to be sure that your doctor provided you with a prescription for meat. I have no idea if making it a prescription increased the cost of meals fifty-fold, or if any insurance plans covered it. But we were assured that the hospital honored the doctors' orders, and the kitchen staff even did a better-than-usual job of preparing the special meals.
Apparently the same work-around will be honored in New York.
Non-plant-based options continue to be available and are offered in accordance with a patient’s prescribed diet.
Choose your doctor well.
It started innocently enough, with an e-mail from AncestryDNA informing me of an additional trait revealed by my genetic makeup: my inclination to seek out or to avoid risky behavior. I could have predicted the result: I definitely prefer to avoid risk. Except, of course, that if I were as risk-averse as they say I am, I wouldn't be about to write something that could get me cancelled by Facebook.
The trigger was in one of Ancestry's explanatory paragraphs:
The world around you also affects your appetite for risk. Younger people and folks who were assigned male at birth report taking the most risks, which may be influenced by environmental factors like social rewards. Some influences are closer to home, like whether your parents encouraged risk taking. Also, our popular understandings of risk may skew more toward physical and financial risks than emotional ones. It's not only risky to do things like step onto a tightrope blindfolded. It's also risky to be honest about your feelings, admit ignorance, and express disagreement. In other words, it's risky to be yourself.
Really, Ancestry? Folks who were assigned male at birth? You mean men? If there's one place I'd expect to be free from this massacre of language, not to mention of reality, it would be a company that makes its money telling people about their chromosomes. When the attendants at my birth announced, "It's a girl!" they were not assigning my sex, they were revealing it, and AncestryDNA should know that better than anybody. Is there any point in trusting the other things they say about my genetics if they think that whether I was born with XX or XY chromosomes is something that was chosen by the birth attendants? Maybe the doctor determined my skin color, too? And the nurse decided I would be right-handed? Humbug.
When American women began coloring their hair, the object was to appear natural (no purple!). Clairol's popular commercial advertising their product contained the catchphrase, "Only her hairdresser knows for sure." My ophthalmologist amended that to, "... and her eye doctor."
While examining my eyes, he had casually announced, "You're actually a blonde." My hair, at that time, was brown, with a smattering of grey. All natural, I might add. A towhead as a child, I had gradually morphed into a brunette. Or so I thought.
"How do you know that?" I questioned.
"You have a blonde fundus. You can dye your hair and fool most people, but your eyes know the truth."
In the meantime, two of the Green Ember books are currently free for Kindle, with more to come next week. But really, the regular Kindle prices are so low, it's not worth stressing if you miss the sales.
I've been writing these essays for more than 20 years. As with all writers (and other artists), I often look back on my work and shudder. Sometimes, however, I'm okay with what I've written. But how often does someone see a blog post from 2010? Current events may not be relevant anymore and can reasonably be forgotten, and most people don't care about our everyday lives. But I've also said a lot that I think bears repeating; book reviews, for example, are almost always just as useful now as they were then. So I'm going to start to bring back some of my favorites, not only because I believe they'll be useful to what is mostly a whole new audience, but also because I need to be reminded of the content myself.
I'll begin with the fascinating, and important, idea of neuroplasticity, which I first wrote about on May 18, 2010.
The Brain that Changes Itself: Stories of Personal Triumph from the Frontiers of Brain Science by Norman Doidge (Penguin, New York, 2007)
The idea that our brains are fixed, hard-wired machines was (and in many cases still is) so deeply entrenched in the scientific establishment that evidence to the contrary was not only suppressed, but often not even seen, because the minds of even respectable scientists could not absorb what they were certain was impossible. Having been familiar since the 1960s with the work of Glenn Doman and the Institutes for the Achievement of Human Potential, I was not surprised by the idea that the human brain is continually changing itself and can recover from injury in astonishing ways. In fact, the only shock was that in a 400-page book on neuroplasticity and the persecution of its early pioneers I found not one mention of Doman's name. But the stories are none the less astonishing for that.
In Chapter 1 we meet a woman whose vestibular system was destroyed by antibiotic side-effects. She is freed by a sensor held on her tongue and a computerized helmet from the severely disabling feeling that she is falling all the time, even when lying flat. That's the stuff of science fiction, but what's most astounding is that the effect lingers for a few minutes after she removes the apparatus the first time, and after several sessions she no longer needs the device.
Chapter 3, "Redesigning the Brain," on the work of Michael Merzenich, including the ground-breaking Fast ForWord learning program, is worth the cost of the book all by itself.
Sensitive readers may want to steer clear of Chapter 4, "Acquiring Tastes and Loves," or risk being left with unwanted, disturbing mental images. But it is a must read for anyone who wants to believe that pornography is harmless, or that our personal, private mental fantasies do not adversely affect the very structure of our brains.
The book is less impressive when it gets away from hard science and into psychotherapy, as the ideas become more speculative, but the stories are still impressive.
Phantom pain, learning disabilities, autism, stroke recovery, obsessions and compulsions, age-related mental decline, and much more: the discovery of neuroplasticity shatters misconceptions and offers hope. The Brain that Changes Itself is an appetizer plate; bring on the main course!
For those who want a sampling of the appetizer itself, I'm including an extensive quotation section. Even so, it doesn't come close to doing justice to the depth and especially the breadth of the book. I've pulled quotes from all over, so understand that they are out of context, and don't expect them to move smoothly from one section to another.
Neuro is for "neuron," the nerve cells in our brains and nervous systems. Plastic is for "changeable, malleable, modifiable." At first many of the scientists didn't dare use the word "neuroplasticity" in their publications, and their peers belittled them for promoting fanciful notions. Yet they persisted, slowly overturning the doctrine of the unchanging brain. They showed that children are not always stuck with the mental abilities they are born with; that the damaged brain can often reorganize itself so that when one part fails, another can often substitute; that if brain cells die, they can at times be replaced; that many "circuits" and even basic reflexes that we think are hardwired are not. One of these scientists even showed that thinking, learning, and acting can turn our genes on or off, thus shaping our brain anatomy and our behavior—surely one of the most extraordinary discoveries of the twentieth century.
In the course of my travels I met a scientist who enabled people who had been blind since birth to begin to see, another who enabled the deaf to hear; I spoke with people who had had strokes decades before and had been declared incurable, who were helped to recover with neuroplastic treatments; I met people whose learning disorders were cured and whose IQs were raised; I saw evidence that it is possible for eighty-year-olds to sharpen their memories to function the way they did when they were fifty-five. I saw people rewire their brains with their thoughts, to cure previously incurable obsessions and traumas. I spoke with Nobel laureates who were hotly debating how we must re-think our model of the brain now that we know it is ever changing. ... The idea that the brain can change its own structure and function through thought and activity is, I believe, the most important alteration in our view of the brain since we first sketched out its basic anatomy and the workings of its basic component, the neuron.
In Chapter 2, Building Herself a Better Brain, a woman with such a severe imbalance of brain function that she was labelled mentally retarded put her own experiences together with the work of other researchers to design brain exercises that fixed the weaknesses in her own brain ... and went on to develop similar diagnostic procedures and exercises for others.
The irony of this new discovery is that for hundreds of years educators did seem to sense that children's brains had to be built up through exercises of increasing difficulty that strengthened brain functions. Up through the nineteenth and early twentieth centuries a classical education often included rote memorization of long poems in foreign languages, which strengthened the auditory memory (hence thinking in language) and an almost fanatical attention to handwriting, which probably helped strengthen motor capacities and thus not only helped handwriting but added speed and fluency to reading and speaking. Often a great deal of attention was paid to exact elocution and to perfecting the pronunciation of words. Then in the 1960s educators dropped such traditional exercises from the curriculum, because they were too rigid, boring, and "not relevant." But the loss of these drills has been costly; they may have been the only opportunity that many students had to systematically exercise the brain function that gives us fluency and grace with symbols. For the rest of us, their disappearance may have contributed to the general decline of eloquence, which requires memory and a level of auditory brainpower unfamiliar to us now. In the Lincoln-Douglas debates of 1858 the debaters would comfortably speak for an hour or more without notes, in extended memorized paragraphs; today many of the most learned among us, raised in our most elite schools since the 1960s, prefer the omnipresent PowerPoint presentation—the ultimate compensation for a weak premotor cortex.
Here are several (but not enough!) from my favorite chapter, "Redesigning the Brain."
[As] they trained an animal at a skill, not only did its neurons fire faster, but because they were faster their signals were clearer. Faster neurons were more likely to fire in sync with each other—becoming better team players—wiring together more and forming groups of neurons that gave off clearer and more powerful signals. This is a crucial point, because a powerful signal has greater impact on the brain. When we want to remember something we have heard we must hear it clearly, because a memory can be only as clear as its original signal.
Paying close attention is essential to long-term plastic change. ... When the animals performed tasks automatically, without paying attention, they changed their brain maps, but the changes did not last. We often praise "the ability to multitask." While you can learn when you divide your attention, divided attention doesn't lead to abiding change in your brain maps.
Somewhere between 5 and 10 percent of preschool children have a language disability that makes it difficult for them to read, write, or even follow instructions. ... [C]hildren with language disabilities have auditory processing problems with common consonant-vowel combinations that are spoken quickly and are called "the fast parts of speech." The children have trouble hearing them accurately and, as a result, reproducing them accurately. Merzenich believed that these children's auditory cortex neurons were firing too slowly, so they couldn't distinguish between two very similar sounds or be certain, if two sounds occurred close together, which was first and which was second. Often they didn't hear the beginnings of syllables or the sound changes within syllables. Normally neurons, after they have processed a sound, are ready to fire again after about a 30-millisecond rest. Eighty percent of language-impaired children took at least three times that long, so that they lost large amounts of language information. When their neuron-firing patterns were examined, the signals weren't clear. ... Improper hearing led to weaknesses in all the language tasks, so they were weak in vocabulary, comprehension, speech, reading, and writing. Because they spent so much energy decoding words, they tended to use shorter sentences and failed to exercise their memory for longer sentences.
[Five hundred children at 35 sites] were given standardized language tests before and after Fast ForWord training. The study showed that most children's ability to understand language normalized after Fast ForWord. In many cases, their comprehension rose above normal. The average child who took the program moved ahead 1.8 years of language development in six weeks. ... A Stanford group did brain scans of twenty dyslexic children, before and after Fast ForWord. The opening scans showed that the children used different parts of their brains for reading than normal children do. After Fast ForWord new scans showed that their brains had begun to normalize.
Merzenich's team started hearing that Fast ForWord was having a number of spillover effects. Children's handwriting improved. Parents reported that many of the students were starting to show sustained attention and focus. Merzenich thought these surprising benefits were occurring because Fast ForWord led to some general improvements in mental processing.
"You know," [Merzenich] says, "IQ goes up. We used the matrix test, which is a visual-based measurement of IQ—and IQ goes up."
The fact that a visual component of the IQ went up meant that the IQ improvements were not caused simply because Fast ForWord improved the children's ability to read verbal test questions. Their mental processing was being improved in a general way.
This is just a sample of the benefits that made me want to rush right out and buy Fast ForWord, even if it were to cost as much as the insanely-expensive Rosetta Stone German software I'm also tempted to buy. From the description, it sounds like something everyone could benefit from for mental tune-ups. Unfortunately, the makers of Fast ForWord are even worse than the Rosetta Stone folks about keeping tight control over their product: as far as I've been able to determine, you can only use it under the direction of a therapist (making it too expensive for ordinary use), and even then you don't own the software but are only licensed to use it for a short period of time. :( It works, though. We know someone for whom it made all the difference in the world, even late in her school career.
Merzenich began wondering about the role of a new environmental risk factor that might affect everyone but have a more damaging effect on genetically predisposed children: the continuous background noise from machines, sometimes called white noise. White noise consists of many frequencies and is very stimulating to the auditory cortex.
"Infants are reared in continuously more noisy environments. There is always a din," he says. White noise is everywhere now, coming from fans in our electronics, air conditioners, heaters, and car engines.
To test this hypothesis, his group exposed rat pups to pulses of white noise throughout their critical period and found that the pups' cortices were devastated.
Psychologically, middle age is often an appealing time because, all else being equal, it can be a relatively placid period compared with what has come before. ... We still regard ourselves as active, but we have a tendency to deceive ourselves into thinking that we are learning as we were before. We rarely engage in tasks in which we must focus our attention as closely as we did when we were younger. Such activities as reading the newspaper, practicing a profession of many years, and speaking our own language are mostly the replay of mastered skills, not learning. By the time we hit our seventies, we may not have systematically engaged the systems in the brain that regulate plasticity for fifty years.
That's why learning a new language in old age is so good for improving and maintaining the memory generally. Because it requires intense focus, studying a new language turns on the control system for plasticity and keeps it in good shape for laying down sharp memories of all kinds. No doubt Fast ForWord is responsible for so many general improvements in thinking, in part because it stimulates the control system for plasticity to keep up its production of acetylcholine and dopamine. Anything that requires highly focused attention will help that system—learning new physical activities that require concentration, solving challenging puzzles, or making a career change that requires that you master new skills and material. Merzenich himself is an advocate of learning a new language in old age. "You will gradually sharpen everything up again and that will be very highly beneficial to you."
The same applies to mobility. Just doing the dances you learned years ago won't help your brain's motor cortex stay in shape. To keep the mind alive requires learning something truly new with intense focus. That is what will allow you to both lay down new memories and have a system that can easily access and preserve the older ones.
This work opens up the possibility of high-speed learning later in life. The nucleus basalis [always on for young children, but in adulthood only with sustained, close attention] could be turned on by an electrode, by microinjections of certain chemicals, or by drugs. It is hard to imagine that people will not ... be drawn to a technology that would make it relatively effortless to master the facts of science, history, or a profession, merely by being exposed to them briefly. ... Such techniques would no doubt be used by high school and university students in their studies and in competitive entrance exams. (Already many students who do not have attention deficit disorder use stimulants to study.) Of course, such aggressive interventions might have unanticipated, adverse effects on the brain—not to mention our ability to discipline ourselves—but they would likely be pioneered in cases of dire medical need, where people are willing to take the risk. Turning on the nucleus basalis might help brain-injured patients, so many of whom cannot relearn the lost functions of reading, writing, speaking, or walking because they can't pay close enough attention.
[Gross motor control is] a function that declines as we age, leading to loss of balance, the tendency to fall, and difficulties with mobility. Aside from the failure of vestibular processing, this decline is caused by the decrease in sensory feedback from our feet. According to Merzenich, shoes, worn for decades, limit the sensory feedback from our feet to our brain. If we went barefoot, our brains would receive many different kinds of input as we went over uneven surfaces. Shoes are a relatively flat platform that spreads out the stimuli, and the surfaces we walk on are increasingly artificial and perfectly flat. This leads us to dedifferentiate the maps for the soles of our feet and limit how touch guides our foot control. Then we may start to use canes, walkers, or crutches or rely on other senses to steady ourselves. By resorting to these compensations instead of exercising our failing brain systems, we hasten their decline.
As we age, we want to look down at our feet while walking down stairs or on slightly challenging terrain, because we're not getting much information from our feet. As Merzenich escorted his mother-in-law down the stairs of the villa, he urged her to stop looking down and start feeling her way, so that she would maintain, and develop, the sensory map for her foot, rather than letting it waste away.
Brain plasticity and psychological disorders:
Each time [people with obsessive-compulsive disorder] try to shift gears, they begin ... growing new circuits and altering the caudate. By refocusing the patient is learning not to get sucked in by the content of an obsession but to work around it. I suggest to my patients that they think of the use-it-or-lose-it principle. Each moment they spend thinking of the symptom ... they deepen the obsessive circuit. By bypassing it, they are on the road to losing it. With obsessions and compulsions, the more you do it, the more you want to do it; the less you do it, the less you want to do it ... [I]t is not what you feel while applying the technique that counts, it is what you do. "The struggle is not to make the feeling go away; the struggle is not to give in to the feeling"—by acting out a compulsion, or thinking about the obsession. This technique won't give immediate relief because lasting neuroplastic change takes time, but it does lay the groundwork for change by exercising the brain in a new way. ... The goal is to "change the channel" to some new activity for fifteen to thirty minutes when one has an OCD symptom. (If one can't resist that long, any time spent resisting is beneficial, even if it is only for a minute. That resistance, that effort, is what appears to lay down new circuits.)
Mental practice with physical results:
Pascual-Leone taught two groups of people, who had never studied piano, a sequence of notes, showing them which fingers to move and letting them hear the notes as they were played. Then members of one group, the "mental practice" group, sat in front of an electric piano keyboard, two hours a day, for five days, and imagined both playing the sequence and hearing it played. A second "physical practice" group actually played the music two hours a day for five days. Both groups had their brains mapped before the experiment, each day during it, and afterward. Then both groups were asked to play the sequence, and a computer measured the accuracy of their performances.
Pascual-Leone found that both groups learned to play the sequence, and both showed similar brain map changes. Remarkably, mental practice alone produced the same physical changes in the motor system as actually playing the piece. By the end of the fifth day, the changes in motor signals to the muscles were the same in both groups, and the imagining players were as accurate as the actual players were on their third day.
The level of improvement at five days in the mental practice group, however substantial, was not as great as in those who did physical practice. But when the mental practice group finished its mental training and was given a single two-hour physical practice session, its overall performance improved to the level of the physical practice group's performance at five days. Clearly mental practice is an effective way to prepare for learning a physical skill with minimal physical practice.
In an experiment that is as hard to believe as it is simple, Drs. Guang Yue and Kelly Cole showed that imagining one is using one's muscles actually strengthens them. The study looked at two groups, one that did physical exercise and one that imagined doing exercise. ... At the end of the study the subjects who had done physical exercise increased their muscular strength by 30 percent, as one might expect. Those who only imagined doing the exercise, for the same period, increased their muscle strength by 22 percent. The explanation lies in the motor neurons of the brain that "program" movements. During these imaginary contractions, the neurons responsible for stringing together sequences of instructions for movements are activated and strengthened, resulting in increased strength when the muscles are contracted.
Talk about unbelievable.
The Sea Gypsies are nomadic people who live in a cluster of tropical islands in the Burmese archipelago and off the west coast of Thailand. A wandering water tribe, they learn to swim before they learn to walk and live over half their lives in boats on the open sea. ... Their children dive down, often thirty feet beneath the water's surface, and pluck up their food ... and have done so for centuries. By learning to lower their heart rate, they can stay under water twice as long as most swimmers. They do this without any diving equipment.
But what distinguishes these children, for our purposes, is that they can see clearly at these great depths, without goggles. Most human beings cannot see clearly under water because as sunlight passes through water, it is bent ... so that light doesn't land where it should on the retina.
Anna Gislén, a Swedish researcher, studied the Sea Gypsies' ability to read placards under water and found that they were more than twice as skillful as European children. The Gypsies learned to control the shape of their lenses and, more significantly, to control the size of their pupils, constricting them 22 percent. This is a remarkable finding, because human pupils reflexively get larger under water, and pupil adjustment has been thought to be a fixed, innate reflex, controlled by the brain and nervous system.
This ability of the Sea Gypsies to see under water isn't the product of a unique genetic endowment. Gislén has since taught Swedish children to constrict their pupils to see under water.
The fact that cultures differ in perception is not proof that one perceptual act is as good as the next, or that "everything is relative" when it comes to perception. Clearly some contexts call for a more narrow angle of view, and some for more wide-angle, holistic perception. The Sea Gypsies have survived using a combination of their experience of the sea and holistic perception. So attuned are they to the moods of the sea that when the tsunami of December 26, 2004, hit the Indian Ocean, killing hundreds of thousands, they all survived. They saw that the sea had begun to recede in a strange way, and this drawing back was followed by an unusually small wave; they saw dolphins begin to swim for deeper water, while the elephants started stampeding to higher ground, and they heard the cicadas fall silent. ... Long before modern science put this all together, they had either fled the sea to the shore, seeking the highest ground, or gone into very deep waters, where they also survived.
Music makes extraordinary demands on the brain. A pianist performing the eleventh variation of the Sixth Paganini etude by Franz Liszt must play a staggering eighteen hundred notes per minute. Studies by Taub and others of musicians who play stringed instruments have shown that the more these musicians practice, the larger the brain maps for their active left hands become, and the neurons and maps that respond to string timbres increase; in trumpeters the neurons and maps that respond to "brassy" sounds enlarge. Brain imaging shows that musicians have several areas of their brains—the motor cortex and the cerebellum, among others—that differ from those of nonmusicians. Imaging also shows that musicians who begin playing before the age of seven have larger brain areas connecting the two hemispheres.
It is not just "highly cultured" activities that rewire the brain. Brain scans of London taxi drivers show that the more years a cabbie spends navigating London streets, the larger the volume of his hippocampus, that part of the brain that stores spatial representations. Even leisure activities change our brain; meditators and meditation teachers have a thicker insula, a part of the cortex activated by paying close attention.
Here's something completely different, and frightening.
[T]otalitarian regimes seem to have an intuitive awareness that it becomes hard for people to change after a certain age, which is why so much effort is made to indoctrinate the young from an early age. For instance, North Korea, the most thoroughgoing totalitarian regime in existence, places children in school from ages two and a half to four years; they spend almost every waking hour being immersed in a cult of adoration for dictator Kim Jong Il and his father, Kim Il Sung. They can see their parents only on weekends. Practically every story read to them is about the leader. Forty percent of the primary school textbooks are devoted wholly to describing the two Kims. This continues all the way through school. Hatred of the enemy is drilled in with massed practice as well, so that a brain circuit forms linking the perception of "the enemy" with negative emotions automatically. A typical math quiz asks, "Three soldiers from the Korean People's Army killed thirty American soldiers. How many American soldiers were killed by each of them, if they all killed an equal number of enemy soldiers?" Such perceptual emotional networks, once established in an indoctrinated people, do not lead only to mere "differences of opinion" between them and their adversaries, but to plasticity-based anatomical differences, which are much harder to bridge or overcome than ordinary persuasion.
Think the North Koreans are the only ones whose brains are being reprogrammed?
"The Internet is just one of those things that contemporary humans can spend millions of 'practice' events at, that the average human a thousand years ago had absolutely no exposure to. Our brains are massively remodeled by this exposure—but so, too, by reading, by television, by video games, by modern electronics, by contemporary music, by contemporary 'tools,' etc." — Michael Merzenich, 2005
Erica Michael and Marcel Just of Carnegie Mellon University did a brain scan study to test whether the medium is indeed the message. They showed that different brain areas are involved in hearing speech and reading it, and different comprehension centers in hearing words and reading them. As Just put it, "The brain constructs the message ... differently for reading and listening. ... Listening to an audio book leaves a different set of memories than reading does. A newscast heard on the radio is processed differently from the same words read in a newspaper." This finding refutes the conventional theory of comprehension, which argues that a single center in the brain understands words, and it doesn't really matter how ... information enters the brain.
Television, music videos, and video games, all of which use television techniques, unfold at a much faster pace than real life, and they are getting faster, which causes people to develop an increased appetite for high-speed transitions in those media. It is the form of the television medium—cuts, edits, zooms, pans, and sudden noises—that alters the brain, by activating what Pavlov called the "orienting response," which occurs whenever we sense a sudden change in the world around us, especially a sudden movement. We instinctively interrupt whatever we are doing to turn, pay attention, and get our bearings. ... Television triggers this response at a far more rapid rate than we experience it in life, which is why we can't keep our eyes off the TV screen, even in the middle of an intimate conversation, and why people watch TV a lot longer than they intend. Because typical music videos, action sequences, and commercials trigger orienting responses at a rate of one per second, watching them puts us into continuous orienting response with no recovery. No wonder people report feeling drained from watching TV. Yet we acquire a taste for it and find slower changes boring. The cost is that such activities as reading, complex conversation, and listening to lectures become more difficult.
All electronic devices rewire the brain. People who write on a computer are often at a loss when they have to write by hand or dictate, because their brains are not wired to translate thoughts into cursive writing or speech at high speed. When computers crash and people have mini-nervous breakdowns, there is more than a little truth in their cry, "I feel like I've lost my mind!" As we use an electronic medium, our nervous system extends outward, and the medium extends inward.
"Use it or lose it" is a common refrain in The Brain That Changes Itself, whether talking about specific knowledge and abilities, or the capacity for learning and the very plasticity of the brain itself. (There is some hope given, however, that knowledge apparently lost is recoverable, even if its brain "map" has subsequently been taken over for another use.) Do you worry, as I did, that these new discoveries mean that it really is possible to learn too much, that we need to save our brains for that which is most important? That learning German will drive away what little I know of French? Relax; that doesn't need to happen, although I must be sure to keep the French fresh in my mind or it will get shelved.
As the scientist Gerald Edelman has pointed out, the human cortex alone has 30 billion neurons and is capable of making 1 million billion synaptic connections. Edelman writes, "If we considered the number of possible neural circuits, we would be dealing with hyper-astronomical numbers: 10 followed by at least a million zeros. (There are 10 followed by 79 zeros, give or take a few, of particles in the known universe.)" These staggering numbers explain why the human brain can be described as the most complex known object in the universe, and why it is capable of ongoing, massive microstructural change, and capable of performing so many different mental functions and behaviors.
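Edelman's comparison is easier to grasp in logarithms. Here is a quick back-of-the-envelope sketch in Python, using only the figures quoted above (the variable names are mine, for illustration):

```python
# Edelman's figures, expressed as base-10 exponents, since the raw
# numbers are far too large to represent directly.
log10_possible_circuits = 1_000_000  # "10 followed by at least a million zeros"
log10_particles_in_universe = 80     # "10 followed by 79 zeros" = 10^80

# Dividing the two numbers means subtracting their exponents: the
# brain's combinatorial space doesn't merely exceed the particle
# count, it exceeds it by a factor with nearly a million digits.
ratio_exponent = log10_possible_circuits - log10_particles_in_universe
print(ratio_exponent)  # prints 999920
```

In other words, even if every particle in the known universe were itself a universe full of particles, the tally would still fall laughably short of the number of possible neural circuits.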
I'm tired of typing. Get the book.
At the recent mall shooting in Greenwood, Indiana, the law enforcement response was well-trained and fast. But what kept this event from being far more tragic was a 22-year-old already on the scene and apparently sufficiently observant, calm, trained, and equipped to stop the carnage almost immediately by taking out the gunman. As Greenwood police chief Jim Ison himself said, "The real hero of the day is the citizen that was lawfully carrying a firearm in that food court and was able to stop the shooter almost as soon as he began."
Once upon a time, 22-year-olds were accustomed to doing the work of adults, managing their own families, farms, and often businesses. As I'm fond of reminding people, the famed Admiral David Farragut took command of a captured British ship in the War of 1812 at the age of 11, and was given his first command of a U.S. Navy ship at 21. With training, experience, opportunity, and higher expectations, our young people can be far more competent at life than we usually give them credit for.
A friend, who has been experiencing the dating scene, offered the following warning. I just fancied it up a bit.
Back in another life, I worked for the University of Rochester Medical Center. However, that was not how I met David H. Smith, the discoverer and developer of the now-common vaccine against Hemophilus influenzae b. That relationship began when my gynecologist suggested that I might want to help Dr. Smith out with his latest research project.
The following quotes are from the URMC article linked above, which recently came to my attention and inspired this post.
After training in pediatrics at Children's Hospital Medical Center in Boston, Dr. Smith served as a captain in the US Army in Japan. While a medical officer, he became the first to link chronic granulomatous disease to a deficiency in white cells. Back at Harvard, he continued his postdoctoral research in molecular genetics and bacteriology and served as chief of Infectious Diseases at Children's Hospital from 1965 to 1976.
Harvard's legendary professor, Charles Janeway, an early researcher on the human immune system, became Smith's role model and mentor. At a time when much research focused on antibiotics, Janeway challenged his young doctors to expand their vision. At the bedside of a child enduring the agony of meningitis, Janeway said, "One of you should try to find a vaccine to prevent this terrible disease." David Smith took up that challenge, and a 15-year quest was begun.
While at Harvard, he continued studying the biology and epidemiology of bacterial drug resistance factors and in 1968 began the search for a vaccine to protect against Hemophilus influenzae b., the cause of bacterial meningitis. Working in close partnership with Dr. Smith was his research colleague Porter W. Anderson, Ph.D.
In 1976, Dr. Smith was called back to the University of Rochester to chair the Department of Pediatrics. ... Dr. Smith and his research team worked flat out on the search for a Hib vaccine. By the early 1980s, the first Hib vaccine had been tested, licensed, and was being produced in a small laboratory within the medical school.
When I entered the scene, in early 1979, Smith and Anderson had a vaccine that worked for older children, but nothing to protect infants and very young children, a critical, dangerous gap. The research project that I joined was working to address the problem by vaccinating women who were hoping to become pregnant, and following their immune responses, through testing for antibodies in the mothers' blood during pregnancy, the babies' blood after birth, and also in breast milk.
It worked! I had a proper immune response, as did our child, who gained further protection through my milk.
I don't know what steps led from that study to the eventual development and acceptance of the H flu b vaccine in use today (it's now called Hib), but even though it was not yet publicly available, they—at my request—very kindly provided it to our second child, born in 1982.
Stung by the resistance of any major pharmaceutical company to buy rights to the vaccine, Dr. Smith decided to create his own pharmaceutical firm. In 1983, he resigned his chairmanship and founded Praxis Biologics. ... By 1989, Praxis had the largest number of new vaccines in clinical trials and one of the finest manufacturing facilities in North America. The initial Hib vaccine (1990) was the first vaccine to be licensed in the U.S. in a decade. The second, a conjugate vaccine, was the first to be licensed for universal use with infants since the rubella vaccine for measles and mumps.
("The rubella vaccine for measles and mumps"? Okay, we all know what they mean, but the article could have benefitted from a proofreader.)
The following is from Dr. Smith's obituary in the New York Times:
In the early 1980's, about 20,000 cases of Hib invasive disease in preschool children were reported to the Federal Centers for Disease Control. In about 12,000 of those cases, the children had meningitis, an inflammation of the brain and spinal cord membranes that can be fatal or cause permanent brain damage. In 1997, a few years after the vaccine became available for infants, 258 cases were reported.
It was a privilege to be part of that work.
When our kids were in school, individual learning styles were becoming a big thing. They were each given a test that supposedly categorized them as Visual, Auditory, or Kinesthetic learners. We could see some truth in the results, though the skeptic in me wasn't sure it had any more basis in reality than finding some truth in the Chinese Zodiac descriptions you see on the placemats in cheap Chinese restaurants.
Here's a presentation that agrees with my cynical view (15 minutes):
The contrast between the enormous popularity of the learning styles approach within education, and the lack of credible evidence for its utility is, in our opinion, striking and disturbing.
There's a large body of literature that supports the claim that everyone learns better with multi-modal approaches, where words and pictures are presented together, rather than either words or pictures alone.
Ultimately, the most important thing for learning is not the way the information is presented, but what is happening inside the learner's head. People learn best when they're actively thinking about the material, solving problems, or imagining what happens if different variables change.
I call this just another bit of evidence for the harm we do when we label people. You're a kinesthetic learner, she has ADHD, he's "on the spectrum," I suffer from face blindness.... Sometimes labels can help us understand ourselves better, but more often they encourage us (and our parents, teachers, employers) to shut ourselves up in boxes and put limits on our abilities.
My constant prayer during Pandemic-tide has been that we would learn to think outside our traditional, largely unquestioned, boxes of life. And so we have.
Many more workers—and their employers—have discovered that remote work can be a good thing. This is not new; back in the day we called it "telecommuting" and it came with both blessings (work from anywhere at any time) and curses (work from everywhere all the time). But, thanks to the pandemic restrictions, the number of people exercising this option has grown to where it's having a significant effect on the demographics of the country. Just ask the citizens of New Hampshire, whose real estate prices have been driven through the roof by pressure from Boston- and New York City-dwellers who no longer need to live in an expensive city to work there. Again: blessings and curses.
More exciting to me is the surge in home education.
A friend sent me this Associated Press article from mid-April, confirming what I've been hearing elsewhere: Homeschooling Surge Continues Despite Schools Reopening.
The coronavirus pandemic ushered in what may be the most rapid rise in homeschooling the U.S. has ever seen. Two years later, even after schools reopened and vaccines became widely available, many parents have chosen to continue directing their children’s educations themselves.
Families that may have turned to homeschooling as an alternative to hastily assembled remote learning plans have stuck with it—reasons include health concerns, disagreement with school policies and a desire to keep what has worked for their children.
[A Buffalo, New York mother] says her children are never going back to traditional school. Unimpressed with the lessons offered remotely when schools abruptly closed their doors in spring 2020, she began homeschooling her then fifth- and seventh-grade children that fall. [She] had been working as a teacher’s aide [and] knew she could do better herself. She said her children have thrived with lessons tailored to their interests, learning styles and schedules.
Once a relatively rare practice chosen most often for reasons related to instruction on religion, homeschooling grew rapidly in popularity following the turn of the century before [it] leveled off at around 3.3%, or about 2 million students, in the years before the pandemic, according to the Census. Surveys have indicated factors including dissatisfaction with neighborhood schools, concerns about school environment and the appeal of customizing an education.
As usual, even a good article gets some things wrong. Home education is no new phenomenon, but as old as the hills. Abraham Lincoln was just one of many homeschooled presidents, though in those days they called it "self-educated." And for a very long time it had nothing in particular to do with reasons of religion. Children were home-educated by necessity (schools unavailable, or children needed at home, e.g. Lincoln), because of an intellectual mismatch between child and school (e.g. Thomas Edison, Albert Einstein), because the atmosphere and philosophies of the schools differed significantly from those of the parents (sometimes associated with a particular religion, sometimes not), or simply because parents and/or children were dissatisfied with what the schools had to offer. In the last quarter of the 20th century, it is true, homeschooling ranks were swelled by Evangelical Christians who had discovered that the Amish were right: home education could meet their needs better than public or even Christian schools. This raised the public's awareness of an educational phenomenon whose adherents had mostly been trying to fly under the radar, and led to home education's establishment as a valid and legal educational approach—at least in the United States. This new familiarity—nearly everyone now knew a homeschooling family—opened the field to many others, with varied reasons for their choices.
The proportion of Black families homeschooling their children increased by five times, from 3.3% to 16.1%, from spring 2020 to the fall, while the proportion about doubled across other groups. [emphasis mine] ...
“I think a lot of Black families realized that when we had to go to remote learning, they realized exactly what was being taught. And a lot of that doesn’t involve us,” said [a mother from Raleigh, North Carolina], who decided to homeschool her 7-, 10- and 11-year-old children. “My kids have a lot of questions about different things. I’m like, ‘Didn’t you learn that in school?’ They’re like, ‘No.’”
[The mother from Buffalo] said it was a combination of everything, with the pandemic compounding the misgivings she had already held about the public school system, including her philosophical differences over the need for vaccine and mask mandates and academic priorities. The pandemic, she said, “was kind of—they say the straw that broke the camel’s back—but the camel’s back was probably already broken.”
I find it especially exciting that minorities are discovering that they are not locked by their circumstances into an educational system that is not meeting their needs. The pandemic restrictions have given families of all descriptions the opportunity to taste educational freedom*, and many, having made that leap unwillingly, have chosen to stick with it.
Choice is the thing. If the great relief expressed by many parents at the re-opening of schools is any indication, I'd say that home education is unlikely to become a majority educational philosophy in America. But it works so well for so many families, including those who opt for different educational choices at different times in their lives—we ourselves made use of public, private, and home education at one time or another—that I'm thrilled to see homeschooling on the rise all over the country, and even the world.
Our established educational system is understandably threatened by any challenge to its power. (Nonetheless, we had many teachers who cheered on our own homeschooling efforts.) But powerful monopolies—in education as well as government, medicine, transportation, information, and all other essential services—are dangerous, even to themselves. Healthy competition can only make our public education better.
One new homeschooling mother summed it up well:
It’s just a whole new world that is a much better world for us.
*I realize that many homeschoolers are cringing at the idea that the at-home learning offered by schools (public and private) during the pandemic bore any resemblance to the true freedom of home education, since it usually attempted to replicate as much as possible the restrictions inherent in formal, mass instruction. Nonetheless, it opened eyes ... and doors.
The Virus and the Vaccine is a cautionary tale about the hasty development and widespread, rapid distribution of a vaccine against a devastating virus, created using a brand-new technology. It's a fascinating and frightening story, and my review is here.
I posted that review in 2005; the story has nothing to do with COVID-19.
The virus was poliovirus, and the vaccine was the Inactivated Poliovirus Vaccine, developed by Jonas Salk. The new technology was growing the poliovirus in cultures made from ground-up monkey kidneys, instead of the traditional time-consuming process of using living monkeys. This sped up the research enormously and made the rapid development of the vaccine possible.
Polio was in the midst of a tremendous surge at the time, and parents welcomed a vaccine against the terrifying disease, which killed and paralyzed and particularly targeted children.
But there was a time-bomb hidden in the vaccine: SV-40, a monkey virus that survived inadequate purification procedures to contaminate nearly every dose of polio vaccine between 1954 and 1963, affecting about a hundred million people in the United States alone. (I was undoubtedly one of them.) Even after the contamination was discovered, the dangers were downplayed—contaminated batches were not recalled, but continued to be used—because it was widely accepted that the monkey virus, being from a different species, would do no harm.
Unfortunately, that proved to be a false and costly assumption. SV-40 is now known to be carcinogenic, and since the mid-1990's has been discovered in many formerly rare brain and bone cancers, as well as lymphomas and leukemias. Is this a cause and effect connection, or a coincidence? The government and medical authorities are still downplaying the issue, because it does not concern the present-day polio vaccine. But even though the Centers for Disease Control say in one place on their website that there is no connection, research reported on another page flatly contradicts that.
Does it matter now? SV-40 is no longer contaminating the polio vaccine. As calamitous as these cancers are, when weighed against the devastation caused by the polio virus itself, it is a reasonable post-facto conclusion that the benefits of continuing to administer the contaminated vaccine outweighed the risks.
What does matter is that the authorities of the time were wrong about the science, and knowingly exposed over half the population of the United States to the contaminated vaccine.
Polio was such a devastating and commonplace childhood disease that parents willingly, nay eagerly, accepted the assurances of the authorities and authorized the vaccine for their children.
Back in 2005, I ended my review of The Virus and the Vaccine with a pro-vaccination message, which I still believe today. But my confidence in the governmental and medical authorities is now at an all-time low, and Big Tech has joined that list. Our vaccine production may be safer today—though maybe not, given that many vaccines are produced in China—but it's abundantly clear that we still get the science wrong, we still suppress information, and we still interfere unreasonably in the medical decisions of others.
When I was very young, my mother used to make apricot-pineapple conserve. I have the recipe; it's simple, just dried apricots, crushed pineapple, and sugar. The tricky part is that the mixture, while cooking, bubbles and spits and must be stirred constantly. My father made a long, L-shaped wooden paddle so she could stir from beyond the surprisingly-long range of the very hot mixture. When my mother made conserve, it was an event.
Which may be why I've only tried the recipe once or twice. That, coupled with the fact that Porter doesn't care much for apricots and even less for pineapple, means other jams take much higher priority around here.
But I miss it, and am always eager to try it out when I find a jar in the grocery store. Those occasions, however, are rare.
Then I got smart.
There, at our local Publix, was the solution. Well, not the ideal solution, but a great deal easier than making my own. Mixed together, the two taste just about as I remember, though the texture is a bit thicker. One of these days I still plan to make it from scratch, even though I lack my mom's amazing paddle. But in the meantime, this provides an awesome gustatory memory.