Mar 25, 2022
Not true. Whether a person likes it or not, the core elements of Christian morality have permeated American culture since the beginning of the colonies, whether Jamestown or the Pilgrim immigrants. I'm not saying all leaders at the time were Christians as we'd define them today, but biblical principles have been the bedrock of our culture since our country's founding. I have read most of our early founders' letters and writings, and it's clear they, at a minimum, respected Christian social principles.