
Socreta93

Was America ever a Christian Nation to begin with?

by Socreta93, 4 Weeks Ago at 08:43 PM
“The first question we needed to address in response to the popular “Take America Back for God” slogan concerned the precedent of Jesus, and in this light we must judge that the slogan can lead us into temptation. The second concerns the meaning of the slogan itself. I, for one, confess to being utterly mystified by the phrase. If we are to take America back for God, it must have once belonged to God, but it’s not at all clear when this golden Christian age was.

Were these God-glorifying years before, during, or after Europeans “discovered” America and carried out the doctrine of “manifest destiny”—the belief that God (or, for some, nature) had destined white Christians to conquer the native inhabitants and steal their land? Were the God-glorifying years the ones in which whites massacred these natives by the millions, broke just about every covenant they ever made with them, and then forced survivors onto isolated reservations? Was the golden age before, during, or after white Christians loaded five to six million Africans on cargo ships to bring them to their newfound country, enslaving the three million or so who actually survived the brutal trip? Was it during the two centuries when Americans acquired remarkable wealth by the sweat and blood of their slaves? Was this the time when we were truly “one nation under God,” the blessed time that so many evangelicals seem to want to take our nation back to?

Maybe someone would suggest that the golden age occurred after the Civil War, when blacks were finally freed. That doesn’t quite work either, however, for the virtual apartheid that followed under Jim Crow laws—along with the ongoing violence, injustices, and dishonesty toward Native Americans and other nonwhites up into the early twentieth century—was hardly “God-glorifying.” (In this light, it should come as no surprise to find that few Christian Native Americans, African-Americans, or other nonwhites join in the chorus that we need to “Take America Back for God.”)

If we look at historical reality rather than pious verbiage, it’s obvious that America never really “belonged to God.”
Gregory A. Boyd



Comments

  1. Anothen
    Hi Greg, take time to watch the following YouTube video and it may change your thinking. I think you will find it to be an eye-opener. This post is offered for educational purposes.

    America's Godly Heritage by David Barton (Wallbuilders video)

    God's Blessing, Gordon

    https://youtu.be/WofsDBaBMq4
  2. Truthseer
    I hate answering stupid posts, but sometimes I have to. Nobody massacred the so-called "Native Americans." Whites didn't even have multi-shot guns until well into the 19th century. We shot once; they fired back, filling the air with very sharp arrows. Most of them were vicious, violent, backward, murderous fiends who attacked us first, and who finally had to be either killed or put on reservations to keep them in line. Many of them, those who are full-blooded Indians, still can't progress. They occupy their time drinking and gambling.

    As to being a Christian nation, you can't found a country on Christianity. You have to found it on specific values through laws. America was founded on rights derived from the Law of Moses, as expressed in the Declaration of Independence. We have a responsibility to obey right laws, and in turn we are protected by those laws.