In 2018 in the USA, this is a belief distinctive to white evangelical Christianity. Born-again Christians believe themselves to be the gatekeepers of what is moral and immoral, and they believe it's their duty to use secular government to impose that morality on society. The saying is "the church is the conscience of society." This idea and the debate around it are pretty much at the center of our political discourse today. And while other sects of Christianity have held this mindset in times past, in 2018 this worldview is pretty much limited to born-again, evangelical Christianity. Mainline Protestants and Catholics have backed away from it in recent decades.
The video below is a clear-cut example of what I am talking about.
So what are your thoughts? Is the church supposed to be the conscience of society? Are Christians supposed to use secular government to enforce their beliefs about right and wrong on society? Why or why not?