The Cultural Mandate: Embracing Divine and Cultural Responsibilities

Rooted in the belief that God has bestowed a divine mandate upon His followers, Christians are called to embrace both our private spiritual lives and our public roles in shaping and influencing the culture around us.
Sex and the City of God

For Christians, it is no longer surprising that the most popular television series and movies receive acclaim precisely for their antagonism toward Christian values. But it remains disappointing that our society so easily dismisses the Christian moral framework that undergirds our nation’s founding and has guided most citizens’ aspirations for centuries.
Principals, Partners, or Pawns? Parents’ Role in Their Child’s Education

Over the past three-quarters of a century, the United States has undergone a process of intense secularization. This is not to say that most Americans have become atheists or hold no religious views.