Wake Up Castle Rock

The United States of America: A Country That Has Left God

Information:

Synopsis

Christianity has always been a central part of American life. Early settlers came here to practice their religion in peace, and it was important to them that their new home protected that right. The original 13 colonies were predominantly Christian, the first of them founded by Puritan separatists who wanted to live apart from what they saw as the ungodly practices of drinking and dancing in church services back home.

Every president up until Franklin Roosevelt was a Christian or had some formal Christian affiliation. Even now, there are plenty of Christians in high places, and in low ones too, for that matter, but America is no longer a Christian country, according to many politicians. You could argue that we have left God behind entirely.

Why is America no longer a Christian country? To answer this question, we need to think about what it means to be a Christian. We have to do more than affirm that we believe in God. Jesus said t