Many threads deal with the theme that America was (or was not) founded as a Christian nation. The posts then argue pro and con.
Why is it so important?
Before anyone gets off on this, please remember that the question is not whether America is a Christian nation, but why it is important to declare America a Christian nation.
It is not. America was never officially a "Christian nation". To declare it one would be very dangerous to everybody as there are so many different flavors of Christianity. A secular nation is really in everybody's best interest. That said, I don't believe the government should suppress religion in the name of political correctness. America, though not founded as a Christian nation, was founded by a majority of Christians, and Christianity had great influence on the culture of the nation during its first two centuries of existence.
Man is a "seeker". It is his nature. Who put that there?
And what is the ultimate end of his "seeking"?
Who made me, and, why am I here?
Think of it as the ultimate game of "hide and seek".
I do believe in God. However, my perception of him or her is not Christian. It's great that Christians find God through the Bible, but they need to understand that other people seek God in their own ways. Native Americans and Sikhs are good examples: they believe in a supreme being, but they don't bind themselves to the Abrahamic concept of God. I think Christians need to respect other people's paths to finding God. I am so tired of door-knocking Christians who repeatedly try to discredit my own connection with God. I guess a lot of people dislike those particular Christians, not all Christians.
He hasn't posted in five years. Pretty sure he is never going to respond.
To pi$$ you off. Just kidding. People really came here for freedom, including the freedom to practice their Christian religion and the Ten Commandments. History books have been rewritten since then, but I have an old set of encyclopedias, published in the '30s, that explains it well.
The first settlers came from Spain and sought business opportunities, not religious freedom. They were Roman Catholic.
The 13 colonies were settled by Protestants, many of whom held Puritan ideology and sought to purify the Church of England of the influence of the Roman Catholic Church. Some went on to act on the belief that it was their obligation to rid the world of witches and ghosts.
Right from the start, there was this "my Christian denomination is better than your Christian denomination" attitude.
What crossed the divide was the convenient belief that people of color were not equal, not really human. Slavery predates most European settlement and is as much a part of American culture as anything else.
It most certainly was not founded as a Christian nation. We even went so far as to write into our US Constitution that Congress shall make no law respecting an establishment of religion.
The reality is that almost every American was a Christian when the nation was founded, and the most popular religion today is still Christianity.
Religion influenced our laws, our culture, and our traditions, and in America that religion was overwhelmingly Christianity. Even the most simple-minded person can understand that much.
Simple: aren't the Ten Commandments something we should embrace as a nation?
Many cite Article 11 of the Treaty of Tripoli as the most definitive statement on this.
Quote:
Art. 11. As the Government of the United States of America is not, in any sense, founded on the Christian religion; as it has in itself no character of enmity against the laws, religion, or tranquility, of Mussulmen (Muslims); and as the said States never entered into any war or act of hostility against any Mahometan (Mohammedan) nation, it is declared by the parties that no pretext arising from religious opinions shall ever produce an interruption of the harmony existing between the two countries.