It boggles the mind. Christianity relies upon two bodies of scripture: the Old and New Testaments.
Look at the Old Testament and you find God ordering raped virgin girls to marry their rapists, and demanding that unruly children, gay people, and brides who do not bleed on their wedding night be stoned to death.
Move on to the New Testament and you find God promising that believers can drink poison without being harmed and can heal the sick just by laying hands on them. It says that whatever you pray for will be done for you, that the stars will fall to earth, and that long-dead people got out of their graves and walked through the city…
Really, how can grown men and women take this seriously? How can anyone read nonsense like this and say, “Yep, that’s me. I’m a Christian”?
Why is it not humiliatingly embarrassing to admit you are a Christian? I would love to know.