Quote:
Originally Posted by Mutiny77
I don't think you've taken enough history or geography classes if you believe this to be the case. And I won't even get into your definition of a "Christian nation" because I'm sure it's pretty superficial. Oh, and then there's the fact that the New Testament says absolutely nothing about military conquests led by the church in an effort to "take over the world" or what have you--indeed, it implies just the opposite.
What in the world do they teach in predominantly White evangelical fundamentalist churches these days?
The truth. And we don't twist what others say, either. You have problems understanding what others write. Where on earth did you get the idea that I mentioned any Christian military actions? That God is in control and that people who follow Him will be victorious in the end is found in the Bible, which is what most Christian churches teach.
It has been my experience that the more a nation or person honors God, the more successful they are. I ask His forgiveness if that is taking His name in vain.
Am I wrong in my understanding that blacks were not Christian until whites taught them?
I am curious how you would account for the fact that the English, French, etc., came exploring and ended up running the country. Is it because the blacks and Indians were so kind and generous that they just let the colonists have their way?