Saturday, February 08, 2025

Don't Think About Elephants

You know that old gag, right? When someone tells you not to think about elephants, you can't stop thinking about them, even if elephants had been the farthest thing from your mind.

I was talking to ChatGPT about my stories this morning, trying to figure out why, after discussing my characters' motivations, it kept bringing up ideas I had told it to discard. For example, the actions of the maidens in my Arthurian stories were always rationalized in terms of feminine agency and independence, even after we'd come to an agreement about the primacy of sexual interdependence. The girls were feminine and needed a knight, but the prose always mentioned agency and independence, if only to justify their femininity.

I finally figured out how to ask ChatGPT why it was happening in a way that gave me a meaningful response. Here's what it said.

What’s happening is actually tied to how I process context and language. When you provide detailed discussions involving keywords describing the rationale behind those traits, my system interprets that context as important to include in the output—even if your intention is to avoid drawing attention to it. Essentially, the system assumes that if you’ve mentioned something repeatedly, it’s central to the scene or discussion.

By having long conversations about cultural evolution in a primal world, I had poisoned the AI with ideas alien to my stories.

If I don't want it to think about elephants, I should simply not mention them.

Unlike you, he is not thinking about elephants.

This is a Gemini image. It's nice, but it doesn't have the same flavor as the ChatGPT ones.
