Is the United States a Christian Nation?
Many politicians, religious leaders, and ordinary citizens embrace the idea that the United States was founded as a Christian nation. The Christian nation narrative holds that the early colonists and Founding Fathers were guided by the Christian God. Specifically, it claims that the early settlers and Revolutionary heroes entered into a covenantal relationship with God, in which the nation and its society would receive protection and blessings so long as they followed God's commands. This learning module will explore the extent to which Americans espouse this belief.