Recently I've watched a couple of series that were well written, with sterling casts. The episodes held my interest. Then the bad guys won. The endings were wretchedly unsatisfying. They left me feeling the way I did in the period after the election, when the country was in so much turmoil.
I was raised on ridiculously pleasant westerns. There was always a happy ending. The good guys won and the bad guys were defeated. It was especially satisfying because our society broadly agreed on what constituted good and evil.
These myths--and I'm speaking of belief systems, not fairy tales--bind civilizations. Our belief systems about "we the people" shape our morality. We know when we sin. And we know when someone else does.
It's unsettling to me that so much of the art created now rewards what would formerly have been considered bad behavior. It's confusing and depressing. Granted, some artists are honestly pulled in this direction. If that's your experience of life, I say more power to you; no one has the right to stop you.
Even if I hate every word you write, I applaud your courage to go right ahead.
But must so many TV shows and books have bleak endings that reward characters who make us cringe? That is simply not my worldview.
Rewarding people who commit terrible crimes erodes the myths that underlie our civilization. These myths aren't lies. They are basic truths. Good eventually wins. Justice prevails in the end. "The mills of the gods grind slowly, but they grind exceedingly fine."
We love heroes. For those of us who can remember what the good guys actually did and how they acted, let's bring 'em back from time to time.