Saturday Lagniappe: ‘What does it mean to claim the US is a Christian nation, and what does the Constitution say?’ And More
What does it mean to claim the US is a Christian nation, and what does the Constitution say? Many Americans believe the United States was founded as a Christian nation, and the idea is energizing some conservative and Republican activists. But the concept means different things to different people, and …