Western society, centered mostly in North America and Europe, is largely bereft of true Christianity. I once saw this picture while surfing the web:
That's the current state of faith in the Western world. And when people say "God," they aren't necessarily speaking of the Christian God. They could mean Allah, or YHWH (Christians believe this is our God; I'm just making a clarification), or they may hold an idea of God that doesn't include things like being engaged on a personal level or sending people to Hell. Notice how strongly America claims to believe in God? Yet how much of that belief do we actually see in our world?
The problem is that over the last several decades the church has become less interested in carrying out its duty to exemplify to the world what it means to be holy. Meanwhile, cultural revolutions have made it more common to live in a worldly manner, rebelling against and throwing off "overbearing institutions" like religion. Yes, I would say we are pretty spiritually desolate.