When we talk about the US being founded as a Christian nation, we have to keep something important in mind. Native Americans lived on this land for thousands of years before Europeans showed up. When people say the US was founded as a Christian nation, what they mean is that brutal and violent Europeans calling themselves "Christians" stole the land from the natives, killed off the buffalo, and founded their new nation here. If that makes the US a Christian nation, then I guess it is.