The Cross Disintegrates

I’ve been wondering how people will regard Christianity in America in the future. This is for obvious reasons (the religious right, hypocrisy) and personal ones (I love to speculate). Truth be told, I don’t see it being anything good.

First, it’s really obvious that the Religious Right et al. have made Christianity synonymous with “bigoted, sexist, homophobic, reality-denying, wealth-worshiping asshole who’s a total hypocrite.” Yes, plenty of American “Christians” violate their own religious tenets, which is obvious as hell when you have even a passing understanding of the teachings of Jesus. They also do not care that they are hypocrites, and have no spiritual curiosity, if they ever had any. Honestly it’s kind of a joke how Christianity has gotten branded.

Secondly, the media has run with this because the Religious Right is loud. They have money, they are publicity hounds, and they are of course politically active – and useful. The Religious Right has been happy to get involved in everyone else’s damn life, and of course the media amplifies that. Plus the American media loves to both-sides things even when people are ranting or opportunistic.

Third, the Religious Right is and will be defined by horrible things. Climate denial. Cruelty towards immigrants (despite a lot of that being critiqued in the Bible). Racism. Selling out. People have been and will be hurt by them, and they seem to enjoy that.

Fourth, and this is sadly little discussed, I think that non-Religious-Right Christianity hasn’t really fought back. Sure, I see some truly good people; you can find all sorts of people doing good things. But I don’t see a fight for the soul of Christianity in America, which you’d think would be really freaking necessary. There are so many people being utter assholes in the name of Jesus, you’d think there’d be a willingness to battle.

But I just don’t see it. Some of it, sure, but not enough that’s big, bold, and in your face. Christians should be utterly pissed at the legacy of grifters like Robertson and Falwell and the like. They should be out there in people’s faces. Heck, maybe some kind of big public act of repentance and penance that would name names.

For whatever reason, the Religious Right has defined Christianity these days. I don’t see that going away, barring some kind of gigantic Great Awakening/Bonfire of the Vanities–type event. Which might happen, but I’m not holding my breath.

So the future of Christianity, in America, is that the Religious Right has pretty much won. They have the dominant description of Christianity. It’s a cruel, greedy, unstable pile of hypocrisy glad to elect and worship any grifter that comes along. I don’t see it changing, either.

What this means is that in future political and social change, Christians – even people who aren’t Religious Right – will be judged as if they are. People won’t be looking to be Christian if they’re not into the whole asshole paradigm. That is, if anyone is even looking for a specific religion.

I feel a strange . . . sadness about all of this? First, that there are just so many assholes, of course. But I also feel bad for the non-asshole Christians, even if I’d wanted them to fight more. I suppose I’d have liked to see a transition to a broader spirituality, but it feels like part of it will be utter, life-ruining, life-endangering failure.

But I don’t see a future for American Christianity where “Christian” isn’t at least secondarily associated with “awful person.” Maybe there will be some kind of syncretic reformist movement, but that’s just maybe.

Xenofact