Is America a Christian Nation?

Since we are close upon fireworks and Independence Day celebrations, this is a good time to ask the question: Is America now—or has it ever been—a Christian nation? It is a contentious topic.  Just Google “Founding Fathers” and “Christian” (or some variant thereof) and you’ll get an idea of just how intense this debate really is.

There seem to be two schools of thought here.  On the one hand, the secularists argue that America has no affinity with Christianity whatsoever.  They would have us believe that this country was founded by a bunch of rabid atheists.  On the other hand, many Evangelicals contend that America was practically established by the Apostles themselves and that the Declaration of Independence and the United States Constitution rank somewhere alongside the great creeds of the Christian faith.

Both are wrong.

America is not now, nor has it ever been, a Christian nation.  That is easily demonstrated: read our founding documents.  Christianity is nowhere specified as the official religion of the United States of America, nor is it even referenced.  More than that, America’s Founding Fathers were a mixed bag.  Some were devout Christians (Patrick Henry and John Jay), while others were not (Thomas Jefferson and Benjamin Franklin).  But to suggest that America’s founding was without strong Christian influence is sheer fantasy.  At least half of those who signed the Constitution were serious Christians.

So, then, what is the answer to the question?

The answer is a bit more nuanced than either group would like to believe.  In fact, the United States was founded upon a combination of Enlightenment philosophy (which is decidedly anti-Christian) and Judeo-Christian principles.  That is to say, America was built upon a foundation of iron and clay, and the cracks in that foundation are beginning to show.

In all the fighting over this issue, perhaps a more important question is being overlooked: Historically, what has our government’s attitude been toward Christianity?  The answer is clear.  American government has traditionally taken a friendly view of the Church, encouraging Christian endeavors.  This is evident in our tax code, our holidays, our laws, and numerous traditions (taking an oath on a Bible, the Pledge of Allegiance, prayers in Congress, and so on).  Even at its worst, our government has been merely indifferent toward the Christian faith.

But this is changing.  There is a growing hostility toward Christianity in this country, a resentment now being expressed by many that Christianity has enjoyed a “most favored” status for far too long.  Tellingly, these antagonists are using the Constitution itself to achieve their end, which is further evidence that America was not founded upon exclusively Christian ideals.  Had the Constitution been a truly Christian document, we might be better insulated from these attacks.

One thing is certain: Christian influence in America will continue to wane if Christians do not themselves engage the culture.  And by “engaging” I don’t simply mean voting or participating in the political process.  It requires a great deal more than that.  Christians need to be a positive force for good in their communities.  We need to take the Gospel seriously and demonstrate a willingness to share it with others.  We need to stop confusing America with Christianity.  And we need to realize that change, real change, is found not in political solutions but in a relationship with Jesus Christ.