
The discussion we had was about living in a post-Christian society today, but my question is: when was America ever a Christian country? You cannot tell me that slavery, the way the government worked, racism, and so on down the list exemplified Christian behavior. The truth is that this country was never a Christian country, nor did it start as one. People say it was, but so many things contradict that claim. That is the reason why, as the days and years go on, this country is getting farther and farther away from God.