America has always been a deeply religious country. That’s just a plain fact. But saying that the U.S. is a religious country isn’t the same as saying that it’s a country with an official state religion. America has never been a theocracy, and trust me, we’re better off that way. Despite the pipe dreams of would-be modern theocrats on the Religious Right who want to impose their brand of fundamentalist Protestant Christianity onto every aspect of American life, the U.S. Constitution explicitly forbids the recognition of any state religion: the Establishment Clause of the First Amendment bars Congress from making any law “respecting an establishment of religion.”