Several times lately the subject of America being a “Christian” nation has come up. I would like to make just a few observations about that.
First, I think it would be impossible to establish that America was founded as a “Christian” nation. It can be argued that our nation was founded under the influence of many men who were decidedly Christian, and that their convictions and the solid truth they held led to a wise and godly platform for our nation and constitution. It is also arguably true that this heritage led to the development of the wealth and strength we hold as a nation. However, there is nothing that I know of in our founding documents that binds us inseparably to a biblical or Christian ethic.
Certainly we as believers know and would like to preserve that splendid and protective ethic, but I doubt that we can, or even ought to, attempt to do so politically. What we are as a nation will always be a reflection of what we are as a people. An ethic cannot be applied from the top down; an ethic is a reflection of the soul of the people, and it will change as the people change. Historically, I believe it can be seen that the ethic of America was greatly altered by the Great Awakening. It was under that influence that our nation was born and our constitution was written. What we lack today is a spiritually awakened people, and that is the business of the gospel, not the business of politics. It must also be observed that the end of gospel work is not nation building but the display of God’s glory in the lives of His people. Nation building is at best secondary to that.
The only nation to which God has made any commitment is Israel. In the future, Israel will be the center of all that God is doing and will reflect His splendid glory. Until that time we are joined with God in the glorious work of the gospel, and national issues are a matter of His providence. Let’s just get back to gospel work!