Let’s get something straight. America has never been a “Christian nation.” Those who believe otherwise have an obligation to say what part of our history was uniquely Christian. Was it when slavery was legal? How about when women were denied the vote? The Gilded Age? The Roaring ’20s?