It has been said that our world is becoming a “post-Christian” world. This means that, over recent generations, the Church’s influence over society has noticeably declined. Christianity is no longer the dominant religion in many places where it once was, giving way to a secularist worldview.