America Was Never a Christian Nation — not in any biblical sense anyway

Friday Thoughts

You’ve heard the pundits. “This is a Christian nation” or, with equal fervor, “This is not and never was a Christian nation.” The reality? America was never a Christian nation, not in any biblical sense. First, America is not a Christian country; it is a country with a veneer of Christianity. That Christian veneer was …
