Originally Posted by munchausen
If we are talking about post-Enlightenment Christians, sure. I don't exactly give Christianity credit for that, though. If we are talking about Christians in the era of the Inquisition and the Crusades, I would take my chances with the pagans.
I think you'd be dead either way, heh. IMO religion itself has been evolving since the beginning of civilization, and Christianity is the most evolved form. Whether Christianity itself deserves the credit for that or not doesn't really matter to me.
I guess if I'd credit anyone, it would be the Greeks, who began the trend by giving their gods human qualities, thus making them fallible, and eventually encouraging people to conquer nature (the gods) rather than cower and bow to them.
Then fast forward to Christianity, where God had a human son. At this rate, pretty soon we'll be realizing God = us.