O. J. Simpson, Casey Anthony, George Zimmerman, Donald Trump ... pick your favorite villain. There's someone out there who you are certain "got away with murder." It rankles, doesn't it? Makes you doubt our system of justice. Me? I'm just as guilty, though I worry more about the uncounted criminals who "get off on a technicality" when everyone involved knows they're guilty—including their own attorneys.
In the court of public opinion, we are all vigilantes. And that's a problem.
We have a lawyer friend who has seen our criminal justice system from several angles, having served for years as a prosecuting attorney, and for years as a defense attorney. I well remember a somewhat heated discussion with him, which started when I mentioned that I like the British system, in which a defense attorney, if he becomes convinced his client is guilty, must step down. (As I understand it, that was at least the case in the past. For all I know it might have changed.) Our friend is a gentle, mild-mannered man, but he vehemently disagreed, insisting that even the most guilty person is entitled to the best possible defense his attorney can provide. Maybe that's why the British attorneys stand down, figuring they can't do their best if they believe their client guilty. Apparently American defense lawyers have no such inhibitions. At least not the good ones.
All that to say, when a lawyer takes on what we believe to be the "wrong" side of a case, that doesn't make him a villain. And when a jury returns a verdict we disagree with, that doesn't make them wrong. They're all doing their jobs, and vitally important jobs they are.
In litigation, sometimes even when you win, you still lose. — David Freiheit.
In the American justice system, one does not need to be proven innocent to be acquitted. Because "innocent until proven guilty" is a bedrock principle, the burden of proof is on the prosecutor, who must show the accused to be guilty "beyond reasonable doubt." Small wonder that We the People, inflamed by media coverage, believe we know better than those who have seen the evidence and heard the arguments. We want to see our version of justice done without any respect for or patience with the due process guaranteed every one of us. Yes, the system sometimes fails, sometimes makes mistakes; I've seen it fail among our own family and friends. But vigilante "justice" is a terrifying prospect. It's time to reprise A Man for All Seasons.
Current society is taking vigilantism to new heights. It's long been true that many of those found not guilty in law courts have nonetheless had their lives ruined (or even ended) by the opposite verdict from the court of public opinion. Now, however, the "String him up!" reaction extends not simply to the former defendant, but even to his attorney, as you can see in the following video.
If you are thinking of becoming a lawyer, and you are so sensitive to the thoughts of others that you sometimes require protection from them, you probably want to find a new line of work. — David Freiheit.
I never had aspirations of becoming a lawyer; I don't think I have the right personality. But even I can see that what this unnamed law school did to attorney David Schoen—rescinding a teaching offer in fear that Schoen's previous job as an attorney for President Trump would make some students and faculty feel uncomfortable—is doing a tremendous disservice to the students they hope to prepare for legal practice. Not to mention to society as a whole. Don't take my word for it; Freiheit says it better.
People don't seem to fully appreciate how dangerous a practice this actually is. — David Freiheit.
Remember what I said in my recent review of Nineteen Eighty-Four?
Most of the analyses I read online consider the climax of the book to be where Winston Smith and Julia betray each other. It seems clear to me, however, that the true climax occurs much earlier in the book, when they believe they are joining the Brotherhood, an organization dedicated to opposing the ruling Party.
"In general terms, what are you prepared to do?"
"Anything that we are capable of," said Winston.
O'Brien had turned himself a little in his chair so that he was facing Winston. He almost ignored Julia, seeming to take it for granted that Winston could speak for her. For a moment the lids flitted down over his eyes. He began asking his questions in a low, expressionless voice, as though this were a routine, a sort of catechism, most of whose answers were known to him already.
"You are prepared to commit murder?"
"To commit acts of sabotage which may cause the death of hundreds of innocent people?"
"To betray your country to foreign powers?"
"You are prepared to cheat, to forge, to blackmail, to corrupt the minds of children, to distribute habit-forming drugs, to encourage prostitution, to disseminate venereal diseases—to do anything which is likely to cause demoralization and weaken the power of the Party?"
"If, for example, it would somehow serve our interests to throw sulphuric acid in a child's face—are you prepared to do that?"
At that point any hope for the future is lost, those opposing evil having shown themselves to be no better than their opponents. Everything after that is dénouement.
As a young child, I received an allowance of 25 cents a week. (A quarter was worth a lot more 'way back then.) From that I was expected to allocate some to spend as I pleased, some for the offering at church, and some to be saved into my small account at the bank. That was the beginning. My family had a culture of saving, as well as giving and spending. Saving was for the future—for larger-ticket items, and for unknown future needs.
Part of the excellent advice I received from my father as I was establishing my own household was to set up a regular savings plan, not only for future purchases but to ensure that I could handle at least a six-month period of unemployment—preferably a full year. Of course it took some time to save that much money when I had all the expenses of newly-independent living to meet, but by making it a priority I soon had a comfortable cushion against unexpected expenses.
Fortunately, I married a man with similar views, which were not uncommon among those of us whose parents had lived through the Depression days. For a number of years we were blessed with two incomes, but made a point of keeping our standard of living low enough that we could live on one and save the other. This stood us in very good stead when disaster hit the American information technology industry, and so many IT workers lost their jobs because the work was transferred to India and other places overseas.
But somewhere along the line the culture of saving was largely lost. Once considered a virtue, saving is now called "hoarding" and held in contempt. It seems to be considered a patriotic duty to spend all one's money—and more. (If true, we have been bleeding red, white, and blue during this pandemic.) However, the ugly consequences of this attitude are nowhere more apparent than in the large numbers of families facing financial disaster due to pandemic-related job loss. So many people have gone in the blink of an eye from enjoying comfortable incomes to standing in bread lines. If they had been encouraged to follow my father's advice and maintain a savings cushion of a year's salary, they would likely have been able to weather this storm with ease. But no one—not the government, not the media, not the schools, not our consumerist society, and apparently far too few parents—has been passing on this essential lesson.
I hope it won't take another Great Depression to recover our lost wisdom.
Mister Rogers' Neighborhood was pretty much the only children's television program seen in our house when our children were growing up. Not regularly, but occasionally, and we had several on videotape that were watched many times over. Unrelated, but interesting, is the fact that our children performed at least once in the Fred Rogers room at Rollins College, and one of them attended college in Pittsburgh and met Mister Rogers himself.
Fred Rogers' legacy is enduring, and his calm, gentle, positive shows are even now being rediscovered by yet another, supposedly worldly-wise and jaded generation.
Yet I have to ask: What happens when the children grow up?
Suddenly their world is filled with people who do not like them "just the way they are"—angry, judgmental people who are quick to find fault, to mock, to sneer, and to revile. Suddenly how they look, how they think, what they believe, and how they vote sets them up as targets. Love and safety have disappeared. Mistakes are no longer seen as acceptable learning opportunities. Even their Neighborhood of Make-Believe has turned dark, tragic, and frightening.
Grownups need Mister Rogers' Neighborhoods, too.
This is not my own, but the person I learned it from can't remember where she first found it. And it's not a direct quotation, because I've modified it to sound better in my own ears. But the sentiment is exactly the same.
"A writer is a writer not because he has amazing talent. A writer is a writer because, even when nothing he does shows any sign of promise, he keeps on writing anyway."
This morning I read part of an article called "Is Florida the New Wall Street?" That link should take you to the same part, though to go any further you need to have a Business Insider subscription, which I don't. The beginning paragraphs were enough to get me thinking about the idea, however.
When the pandemic hit New York City, Florida was overwhelmed with people from New York, New Jersey, and Connecticut who had decided to flee here. When our governor attempted to impose a quarantine period, he was overwhelmingly mocked, derided, and shut down by New York and other states, with cries of "overreaction" and "interference with interstate commerce." Of course, it was not long before New York and many other states turned around and decided to implement their own quarantines. It reminds me of the European assault on President Trump for closing our borders—and their subsequent decisions to do the same thing themselves. Mind you, I was not happy with the president's decision to close off traffic from Europe, since it happened just in time to cancel a long-awaited visit from our Swiss family. But the hypocrisy of the reaction (from both Europe and New York), without any apology when they decided to implement the same policies, is galling.
But this post is not actually about the pandemic directly. It's about another flood of New Yorkers who might be coming Florida's way.
The pandemic and the rise of remote work are accelerating movement from the Northeast to the Southeast, and that has some suggesting a tipping point has been reached.
“I suspect” Florida will soon rival New York as a finance hub, Leon Cooperman, the hedge fund manager who founded New York-based Omega Advisors, told Business Insider in an email. “‘Tax and spend’ has been [the northeast’s] policy. It has to change or New York, New Jersey, and Connecticut will become ghost towns.”
It's not as if the business would not be welcomed: Florida needs solid jobs that are not so dependent on the tourist industry. But we do not need more people who are interested in making Florida into a second New York.
I lived in Upstate New York for much of my life, and recall well the division between New York City and the rest of the state, with the large-population City tail largely wagging the State dog. Hence New York's high taxes, strong unions, and onerous gun laws. Florida is in a similar situation, with the Miami/Palm Beach area being worlds apart from most of the rest of the state. If a large influx of New Yorkers comes to that part of the state hoping for more freedom, a better tax situation, and a lower cost of living, they'll find them—but if they bring with them the same attitudes that have led to the troubles they are fleeing, then we will all lose.
We have a friend who one year visited us from New York for the express purpose of trying to influence Florida's elections. His company was welcome, but I tell you, I'm a lot more worried about that than about whatever the Russians might be doing via our social media.
When we joined one of our previous churches, the pastor explained, "You do not have to agree with us to be welcome here. We only ask one thing: don't try to change us. If you feel the need to change our culture, you are released from your membership vows and are free to find another church that may be a better fit for you." When push came to shove, that's not exactly how it worked out, but the theory made sense to me.
I know whereof I speak. When we moved to Florida from New York more than 35 years ago, I was the quintessential Northeastern snob. It took me several years to realize that Florida was not (and is not) the backwards, ignorant place my prejudice had led me to believe.
I still miss New York and the Northeast. I especially miss great apples and unpasteurized cider. But the solution is not to plant apple trees here in Florida, but to appreciate citrus trees and unpasteurized orange juice. And to visit the places we have left behind.
We need to let Florida be Florida, New York be New York, Texas be Texas, and Montana be Montana. Just as Europe is realizing that they must not give up French, Norwegian, and Dutch culture for the sake of the European Union, we need to work for the United States to be united while remaining individual states. If we allow ourselves to become a homogenized monoculture, I can just about guarantee it will not give us the best of everything, but the worst—or if we're lucky, mediocrity.
Florida taught me that. Do you think you know what orange juice tastes like? What you buy in the store, even "fresh squeezed," is taken apart, put (somewhat) back together, cooked (pasteurized), and deliberately made so that every carton of orange juice tastes the same as every other. You haven't really tasted orange juice until you drink it raw, without all the processing, and with flavors that change as the season progresses and different varieties of orange go into the juice.
Florida does not need to be pasteurized and homogenized. I don't mean there aren't areas in which we can improve. But there's a huge difference between working for change from inside a culture you love, and running roughshod over a community to which you have fled, without regard for the local population. Cultural imperialism is no more palatable than any other kind.
So come, New York refugees. Live here, grow here, become Floridians. But don't bring New York with you. When I want to experience New York culture, I'll take a vacation there.
I'm certain Facebook had no idea what it was doing when it banned my 9/11 memorial post. Suddenly I started looking into others who claimed unfair and unreasonable censorship, people I had previously ignored. Contrary to what I had been led to believe, I have yet to find anything extremist, evil, hateful, or even particularly objectionable—certainly nothing as egregious as other offenses that Facebook seems to have no problem with. What I've seen has been at worst annoying. Of course I find things I disagree with (what else is new?) but also a lot that is interesting, reasonable, and fits with the world as I know it. Nothing, that is, that could justify Facebook's censorship.
I doubt that Facebook's intent in banning certain content was to inspire me to investigate that content, but that's not an unusual reaction. Ban a book or a movie and you generate interest in what would probably have died an ignoble death on its own. Were it not that platforms like Facebook, Twitter, YouTube, and such are so massive, and virtual monopolies, I would be less concerned.
Here's just one example. Even if the following video were not about censorship, it would be amazing, as I have never before been fascinated by legal language.
David Freiheit is a Canadian lawyer who gave up litigation for full-time video commentary on current issues from a legal standpoint. From what little I've seen of his YouTube channel, his style is a little on the crazy side, but he makes legal issues and legal documents interesting, which qualifies him as a miracle worker as far as I'm concerned.
Two things particularly struck me in watching this. The first is that I had no idea how lucrative Facebook advertising can be—the advertising that I generally ignore. For me, Facebook is a place for communicating with family—or friends, since most of my family has now deserted the platform—and I ignore the larger picture. But there's another world out there, a world of high finance, a world where there are worse consequences for offending the Facebook gods than having your post deleted. Can you even imagine a world where Facebook can demonetize your post, which takes away the percentage of ad revenue that you normally receive, and thus cost you over a million dollars a month?
Secondly, I was unaware of the practice of not just cutting off, but actually stealing that ad revenue. If I have an ad-revenue contract with Facebook, and they delete my post, I merely lose the income. But if they leave the post up, and merely demonetize it, they can still run ads—but Facebook (YouTube, whatever) gets all the revenue, rather than having to share it with the content provider. In other words, by determining that your content is somehow "wrong" they can take for themselves all the ad revenue the post generates.
This is from my understanding of the process, not from the video below. What the video adds is the idea that the "independent fact checkers" themselves benefit from determining that a post is "false": they can include links to their own content, funneling income from the original poster to themselves. That is a huge conflict of interest and a positive incentive to label something as false. I get that one would want to be able to click through to the reasoning behind such a label, but at the very least the fact-checkers should not profit from it. Anyway, if you have 23 minutes and are curious as to how legalese can possibly be interesting, here it is.
Robert Heinlein wrote that the Year of the Jackpot was 1952. It's a pity he died in 1988, because he would have loved 2020.
In one of those serendipitous Internet moments, I recently came across the answer to a puzzle that has been nagging at me for months. Longer, really.
Every time I'd shake my head and say, "The world has gone completely insane"—which I have been doing a lot this century—I'd remember a science fiction story from my distant past. I couldn't recall the title, the author, or enough of the plot to begin to find it, though I tried halfheartedly now and again.
Then it was handed to me on a platter, in the form of a notification from eReaderIQ that a book from an author I'm following (Robert Heinlein) was on sale for 99 cents. It was called The Year of the Jackpot and is in reality a short story, not a book. You can read it for free right here, at the Internet Archive.
I knew as soon as I saw the title that this was the story I had been remembering. Heinlein is a mixed author: some of his works are brilliant and delightful, others quite frankly off-the-rails unpleasant. This one is not a happy tale, but it is fascinating and enjoyable.
Here's one of my favorite paragraphs:
He listed stock market prices, rainfall, wheat futures, but the "silly season" items were what fascinated him. To be sure, some humans were always doing silly things—but at what point had prime damfoolishness become commonplace? When, for example, had the zombie-like professional models become accepted ideals of American womanhood? What were the gradations between National Cancer Week and National Athlete's Foot Week? On what day had the American people finally taken leave of horse sense?
Pretty mild compared with the decades-long "silly season" we're in now, isn't it? But the ending, well....
Potiphar Breen is a statistician whose hobby is charting cycles. And in the year 1952 they are not looking good at all.
I've been having fun cleaning out computer files, and came upon some old photos that made me wonder just how much our personalities are determined at birth or very early in our childhoods.
Here I am at age five, drinking water after enjoying a hike in the Adirondacks, wearing comfortable clothes and not sitting cross-legged. Comfort remains my highest criterion for choosing clothing, I have always loved spending time in the woods, water is my beverage of choice, and I have never, ever been able to sit cross-legged without serious discomfort.
Here's evidence from age six that my mother did her best to nurture my feminine side and teach me to be a proper young lady of the times. It didn't take. You can see by the expression on my face my heroic struggle to endure privation and torture for completely unfathomable reasons. Nothing has changed.
As a young mother, Thanksgiving at my in-laws' place in Moncks Corner, South Carolina, usually found me collapsed on the hearth in front of a comforting fire. This picture from Christmas at my grandparents' house in Rochester, New York shows that when I was seven I felt much the same way.
When this swing was new, it graced my grandfather's house. When he moved in with my family in Pennsylvania, the swing came with him. That particular swing is gone now, but when Porter found one for all practical purposes identical at our local Lowe's, I had to have one for our Florida back porch. North or south, for more than 60 years I have found it one of the most comfortable places ever to sit, rock, read, think ... and sleep.
I don't mean we can't change. Nature vs. Nurture should no longer be a debate, since the answer is so clearly "both/and." But it still surprises me when I come across evidence—in myself and others—that many of our present characteristics were manifested very early on, if we'd had the eyes to see. Mostly it's fascinating; it only becomes depressing when I look back and realize I'm still fighting battles it seems I ought to have long since conquered and moved on.
So I wonder: Should we, as parents, be more alert to problems in our children that could lead to trouble in the future, and deal with them, rather than letting them slide and hoping they'll outgrow them? If so, which ones are manifestly bad and need eliminating (e.g. lying, laziness, disobedience), and which ones are just part of what makes us individuals (such as a need for solitude, a sensitive nature, or a very logical mind), in which case our job is to help the child both take advantage of the good and develop coping strategies for the bad?
Those of us who lived through what I think of as the "Carter Inflation" have a deep-seated fear of that economic disaster, and a greater fear that more recent generations don't take it seriously enough. (To be fair to President Carter, presidents get more blame and take more credit than they deserve for economic conditions. I think Carter, a good man, was a bad president with policies that made inflation worse, but it's far from exclusively his fault.)
Inflation under Carter was not a disaster for us, personally, since it was a time when salaries and investment income appeared to be increasing at a great rate. That felt good, though it only meant that we were barely keeping up with rising prices. It was not so merciful to people without good jobs and investments. We also knew enough history to fear the devastation inflation had caused in other times and places.
You might understand, then, why I am frustrated when I hear reports of "inflation indices" that say we are experiencing little or no inflation—when I know darn well that prices in the grocery store have been rising steadily for a long time, most "half-gallon" ice cream packages now hold only three pints, and the price of automobiles has exploded through the roof.
I read with interest the article by John Mauldin called "Nose Blind to Inflation." It's long and gets complicated and I did start skimming as I neared the end, but it says a lot about the factors that go into determining a currency's inflation rate—and why it's so hard to come up with numbers that mean anything at all. As my economist husband says, it is important to understand that inflation is not a mathematically provable number, but rather a statistically approximated number. Moreover, the numbers that are published are not immune to political pressure.
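The point that inflation is a statistical estimate rather than a provable number can be illustrated with a toy calculation. The prices and basket weights below are entirely invented for illustration (real indices like the CPI use surveyed data and far more categories), but they show how the same set of price changes produces different "inflation rates" depending on how the basket is weighted:

```python
# Illustrative only: hypothetical prices and weights, not real CPI data.
# The same price changes yield different "inflation rates" depending on
# how the basket of goods is weighted.

old_prices = {"groceries": 100.0, "rent": 1000.0, "electronics": 400.0}
new_prices = {"groceries": 112.0, "rent": 1050.0, "electronics": 360.0}

def inflation_rate(weights):
    """Weighted average of each item's relative price change."""
    total = sum(weights.values())
    return sum(
        weights[item] / total * (new_prices[item] / old_prices[item] - 1)
        for item in weights
    )

# A basket weighted like a family's actual spending...
family_weights = {"groceries": 0.40, "rent": 0.50, "electronics": 0.10}
# ...versus one that leans toward falling-price electronics.
gadget_weights = {"groceries": 0.15, "rent": 0.45, "electronics": 0.40}

print(f"family basket: {inflation_rate(family_weights):+.1%}")
print(f"gadget basket: {inflation_rate(gadget_weights):+.1%}")
```

With these made-up numbers, the family basket shows noticeable inflation while the electronics-heavy basket shows almost none—which is exactly why a published index can say "little or no inflation" while the grocery bill says otherwise.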
I'm not even going to try to guess what is going to happen to our currency now that the pandemic has encouraged us to hemorrhage money that we don't have and drive our national debt well beyond the stratosphere. Far more knowledgeable people than I haven't a clue.
But I can't resist one quote from the article, which begins the section on an inflation calculation factor called hedonic adjustment.
That’s where they modify the price change because the product you buy today is of higher quality than the one they measured in the past.
This is most evident in technology. The kind of computer I used back in the 1980s cost about $4,000. The one I have now, on which I do similar work (writing) was about $1,600. So, my computer costs dropped 60%. But no, today’s computer isn’t remotely comparable to my first one. It is easily a thousand times more powerful. So the price for that much computing power has dropped much more than 60%. It’s probably 99.9%.
The economists pull the same sleight of hand with automobiles, and television sets, and any product in which it is claimed that you are getting more value for your money, and therefore it shouldn't count as a price increase. Which is utter nonsense. (I put the point a little more strongly when I first read about the concept.)
Sure, I often like the "improvements" that have supposedly added value to the item I am purchasing, but the real value of a car is that it gets me from A to B, and why must I pay for all the extra bells and whistles if that's all I want? It reminds me of a housing developer I know, who was chided for not providing more "affordable housing." "I could make housing affordable for everyone," he replied, "If people were willing to live in the kind of homes their grandparents did. But now that won't even begin to pass code."
So sure, go ahead and make things "new and improved." But if I can no longer buy the original version, don't try to sell me the bill of goods that when the price goes up it's really a price decrease.
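The arithmetic behind the hedonic adjustment described above can be sketched in a few lines. The sticker prices and the "twice as good" quality factor here are invented for illustration (real hedonic models estimate quality from measured product attributes), but the mechanism is the same: divide the new price by the estimated quality gain, and a visible price increase becomes an official price decrease.

```python
# Hypothetical numbers for illustration; real hedonic models regress
# price on measured quality attributes, but the arithmetic idea is this:

old_price, new_price = 20_000.0, 30_000.0  # sticker prices of a car
quality_multiplier = 2.0                   # statistician's estimate:
                                           # "the new car is twice as good"

sticker_change = new_price / old_price - 1
# Quality-adjusted: divide the new price by the quality gain first.
adjusted_change = (new_price / quality_multiplier) / old_price - 1

print(f"sticker price change:          {sticker_change:+.0%}")   # +50%
print(f"quality-adjusted price change: {adjusted_change:+.0%}")  # -25%
```

So the buyer pays 50% more at the dealership, while the index records a 25% price drop—for a car the buyer may never have asked to be "twice as good."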
This is for our children, and others of later generations who may find it difficult to fathom how great was the pressure on my generation to have no more than two (and preferably fewer) children. It was real. I'm ashamed to have given in, but it was the sea in which we all swam; it was the unquestioned Science of the day; it was the Way Things Were Done. It's a true story that some people would pray for multiples the second time around, so they could have more than two children and still be held blameless. ("Selective reduction" was not a thing back then.) Moreover, I was less inclined to fight against the tide of my peers back then—we who came of age in the late 60's and early 70's were too busy concentrating on pushing back against those who had come before us.
The following is extracted from a George Friedman Geopolitical Futures essay, "Variations on Apocalypse," published February 13 of this year. He nails the sentiment of the times—which lasted long beyond the supposed apocalypse year of 1970—exactly.
Throughout the 1950s and 1960s there was an intense belief held by the best minds that humanity was on the eve of destruction. Rock music was written with this title. The cause of this catastrophe was overpopulation. By 1970, the Club of Rome, a highly respected gathering of the best and brightest, said the world would no longer be able to feed itself and would be running out of natural resources. Unless humanity repented of the sin of reproduction, it would annihilate itself. This was a belief that could not be challenged, and those who said not only that it was untrue but that the birthrate would soon plummet were dismissed. The coming apocalypse was written in stone, and those who would challenge it either were mad or would profit from the apocalypse.
What always struck me about this, and virtually every class I took included at least one lecture on this, was that those who argued the apocalyptic view were not actually frightened by it. They loved the role of Jeremiah. They awaited it with the faith of the righteous and, I suspect, were looking forward to the last moment, when they could scream, "I told you so."
It's easy to see one's past mistakes, but much, much more difficult to discern which of today's "certainties" we will be regretting in the future. I haven't a clue, but I suspect that the first place to look should be among (1) ideas and practices that are so much a part of our own culture—meaning primarily the culture of our peers—that we never think to question them; (2) ideas and practices for which dissent is discouraged, mocked, or even forbidden; and especially (3) ideas and practices that make us feel morally superior to others.
Facebook banned my 9/11 tribute post on the grounds that it violates their Community Standards. Fortunately, I'm the censor here.
Recently I stumbled upon The Conservative Student's Survival Guide. It's a five-minute video offering advice to—you guessed it—conservative students who find themselves a despised minority on liberal college campuses. That's no joke: for all the talk you'll hear from academia about tolerance, liberal values, and minority rights, it's a jungle out there if your particular minority isn't currently in favor, and it seems the only status more dangerous than "conservative student" on most American campuses is "conservative faculty." It was true when we were in college, it was true when our children were in college—and everything I see leads me to believe the situation is far, far worse now.
What's surprising about this video is that, unlike much that comes from both Left and Right these days, it is calm, well-reasoned, and respectful. What's more, even though it's aimed at conservative students, any thoughtful person who wants to make the most of his college experience would do well to consider this advice.
The speaker is Matthew Woessner, a Penn State political science professor. All of his seven suggestions make sense, but my top three are these:
- Avoid pointless ideological battles. It's not your job to convert your professors or your fellow students. Discuss and debate, but don't push too hard.
- Choose [your classes and your major] wisely. I was a liberal atheist in college, but much on campus was too far Left even for me. Being a student of the hard sciences saved me from a great deal of the insanity that was going on in the humanities and social sciences departments. A quarter-century later, one of our daughters found some of the same relief as an engineering major. Our other daughter, however, discovered that life at a music conservatory was quite difficult—despite the name, conservative values were not welcome.
- Work hard—college faculty value hard-working, enthusiastic students. I'd say this is the most valuable of all his points. Excellence and enthusiasm are attractive. A student who participates respectfully in class, does the work, and learns the material will gain the respect and appreciation of most of his professors. Teachers are like that.
A friend recently posted a sign which said, "Trump took ... the united out of the United States."
The illogical falseness of that statement jumped out at me, and believe me, my friend is smart, so I know she must also have seen it. Most memes of that sort aren't even trying to be logical; they're trying to make a point.
Nonetheless, my first reaction was to ... react. To respond with a comment.
Then I remembered that I am fed up with arguing, and am trying a new approach.
When I comment on someone else's blog, or social media post, I am stepping into his space and time. Would I ring my neighbor's doorbell and tell him, "I see you're getting your house painted; that's a terrible color!" I think I can do better than that. If I have no positive comment to make, much better I should say nothing at all. That doesn't mean I'm going to stop commenting—I know myself too well for that—but I hope to be more positive, more relevant, and more personal when I do, conscious that I am walking through someone else's yard.
My own space, however, is a different story. Here, on my blog, or on my own Facebook page—that's where my own opinions belong. If people find my posts interesting, or helpful, I'm glad. If they do not, they are free to walk away. When I first began writing this blog, I had hopes that it would become (among other things) a forum for debate and discussion of issues. Now that I've seen what that looks like on Facebook, I'm rather glad it mostly has not. The more experience I have, the more I realize that people long for information, and can be persuaded by information—especially when accompanied by personal testimony—but are rarely moved, except possibly in the opposite direction, by argument and debate. Maybe it wasn't always so, but it certainly is now.
Back to the original inspiration for this post: the idea that President Trump had divided America. I think that's completely wrong.
The election of President Trump, if you will, is evidence that America is divided. All close elections are. When you win a close election, the first thing you should realize is that half of the country is unhappy about your victory. Even should you win an astonishing 75% of the vote, you still will have ticked off a quarter of the voters.
America has always been a country of deeply-felt and deeply-divided opinions. Even a small study of history—in my case, genealogy—makes that obvious. The difference now, as I see it, is that instead of expressing our opinions to a few neighbors, we tell them to the world.
As I do here.
Perhaps I am as guilty as President Trump of dividing America.
Here's a Pearls Before Swine comic, from August 26, 2018, still appropriate to the day. (Straight from our refrigerator to you.)
I think America owes ISIS an apology. We were so self-righteous over their destruction of ancient monuments—sometimes more upset by that than by their destruction of people. Now we are doing it ourselves. If the history isn't as old as in the Middle East, it's the same abominable impulse.
That's as heavy as this post is going to get. On the lighter side, here is a word for our modern iconoclasts from Psalm 105, at least as interpreted by Sunday's church bulletin.