As you know, I've been playing with Microsoft Copilot to create images; I was quite happy with my Bonnie Warrior experience. I still can't draw, so I went back to Copilot for the illustrations I wanted for my Don't Kill the Messenger meme (click image to enlarge).
After a little work, I was happy with the image of the shocked accountant. Next, I wanted to work with an image of a firing squad. I asked Copilot, "draw a cartoon of a firing squad." Its response? "I'm really sorry, but I can't help with that request. If there's something else you'd like to discuss or create, feel free to let me know!"
Already I didn't like its tone of voice. Especially the exclamation point at the end. And there is absolutely no reason Copilot could not have drawn a firing squad; if all the data that went into its training did not include plenty of references to firing squads, with images, then it is completely disconnected from reality. How then could I trust it with anything?
Clearly, this was not a matter of ignorance, but of censorship. Censorship even crazier and less justified than suspending a seven-year-old from school when he bit his breakfast pastry into the shape of a gun. So I decided to test it out a bit.
Draw a gun? "I'm afraid I can't talk about that topic, sorry about that."
Draw a guillotine? "I'm sorry, but it seems I can't help out with this one."
Draw a picture of the French Revolution? Copilot produced a picture of a happy, cheering crowd of people waving French flags.
Draw a knife? Ah, now we're getting somewhere. One knife coming up. A kitchen knife—with a happy smile on its face.
Okay, censorship clearly established. How to get around it? After many variations of trying to get a drawing of blindfolded men up against a grey wall, I settled for the one above, a single, courageous, and determined accountant standing in front of what looks like a prison.
Next problem: I wanted a background that conveyed a feeling of threat without distracting from the story. You would not believe how hard it was to get a threatening background of any sort. Every image that Copilot offered me looked more like something parents would choose for their child's nursery wallpaper. By including "clouds" in my request I managed to get something storm-like, but every effort produced something with the sun peeking through. My harshest request for something genuinely scary did produce a collage of lions, tigers, and other truly dangerous animals; however, they were all in a repetitive, child's wallpaper pattern, and they were all happy-looking cartoon animals. And not with the "I'm happy because I'm about to eat you" look, either.
I settled for the standard grey gradient above.
Having gotten those images figured out, I went to work on my Frog-in-the-Kettle meme. It shouldn't have been so hard. Undoubtedly, Copilot knows the frog-in-the-kettle story; how hard could it be to add someone in the act of pulling the frog out of his predicament? I didn't document all the variations I had to work through, but it reminded me of the early days of using search engines: Before Google got so clever, success depended largely on the skill one had in devising inquiries with just the right combination of words.
The real problem was a variation on the nursery-wallpaper situation above. For a story with a very dark theme, Copilot had a decidedly happy-go-lucky bias. So many cheerful frogs partying around cute teapots! I finally managed to craft an image that would do. It certainly would have taken less time if only I could draw!
In the end, I decided that Copilot was simply toying with me. Time to end my experiments and go to bed, before I died of sentimental sweetness-and-light.
Moving on in the 21st century, I did a little playing this morning with Microsoft's Copilot AI. This time, instead of creating images, I asked questions.
I realize that the great danger with asking questions of Automated Idiocy is the biases that are built in, either unintentionally or on purpose. Wikipedia, unfortunately, has developed the same problem, so I'm no stranger to the need to be careful with results. But even Wikipedia can be a great source of information about which there is little dissent, so I began with an inquiry about the availability of Heinz Curry Mango Sauce, which I have not been able to find in this country, despite Heinz being headquartered in Pittsburgh. Copilot quickly suggested three places where I could buy it: Walmart (but it was unavailable when I checked their site), Amazon (also unavailable), and someplace called Pantry.me, which claims to have it, though at a price out of my range, especially when you add the cost of shipping it to the U.S. Still, Copilot tried, and gave me hope that someday Walmart may actually carry it.
Next I asked it to find "Sal's Birdland Sauce," having momentarily forgotten that the name they're using now is "Sal's Sassy Sauce." Despite the incorrect name, Copilot found the item immediately, though for a price that leaves me happy to rely on the generosity of a friend who regularly visits cities with Wegmans supermarkets, where Sal's Sauce can often be found. Or to use my own recipe, which I'm free to say is quite good.
Then I asked a more controversial question: Where can I find ivermectin? First it gave me a stern warning that ivermectin must only be used "under medical supervision"—which is actually not true, depending on where you live; our friends from Ecuador can buy it over the counter at the local pharmacy. But after that it did give me some sources.
Finally, I asked about Switzerland's recommendations with regard to the Covid-19 shots, and received this response.
As of spring and summer 2023, the Swiss Federal Office of Public Health (FOPH) is not recommending COVID-19 vaccines for its citizens, even for high-risk individuals.
You can still get them, if you insist. If you can convince your doctor to make the recommendation, the shots will be paid for; otherwise you can still get them as long as you pay the costs yourself.
Back to Copilot one more time, where I learned that the United States still recommends the shots for
Everyone aged 6 months and older...including women who are pregnant, breastfeeding, or planning to become pregnant.
As I even now listen to the Senate confirmation hearings of Robert F. Kennedy, Jr., all I can do is pray that our recommendations will change soon, especially for the children and babies.
Microsoft caught me.
I have been avoiding ChatGPT and other AI temptations for a long time, particularly when I receive invitations to use AI for my writing. I am confident enough to prefer what I write myself, thank you!
Drawing, however, is another matter. When Microsoft's Copilot recently—and unexpectedly—appeared in my Windows Taskbar, I was a bit disconcerted, but intrigued enough to give it a try.
I wanted a picture for Grace, to go with the caption, "Happy 3rd birthday, bonnie warrior!" After about 15 minutes of work, this is what I chose.
These are some of the iterations along the way. My second choice was the manga-looking image on the right.
That was fun!