I continually see articles and hear people talk about how American culture is now considered “Post-Christian”, meaning that society at large no longer bases itself in Christian morality or beliefs. I don’t argue with this idea; in fact, I think it accurately describes our culture as one that has roots in Christianity but has moved on without it for the last handful of generations. Many people I know lament this fact, yearning for the “Good Ol’ Days”, but I instead see it as an opportunity. I can’t help but wonder, when was the last time these fields were so white and ready for the harvest? This shift in culture is an opportunity for us to go out and claim more souls for Christ! In fact, this move into Post-Christianity may even be beneficial for the Church’s mission. No longer will people claim Christianity simply because it’s the “cultural requirement”; instead, they will openly acknowledge their lack of faith, which allows us to stand as a brighter light in contrast. Europe, and England in particular, has held this title of “Post-Christian” for a long time now, and on my mission trips to Scotland in college I saw firsthand the power of the Church in a land that denies Him. Let us not be discouraged but empowered to help further God’s Kingdom in our own backyard.

Blessings,
Houston Haynes
 