I have noticed over the past year or so that people from other nations sometimes view Christians in the United States as heretics and avoid socializing with them because of a stigma attached to this nation and the Christians who live here. I'm curious what others think about this. Personally, it bothers me to be lumped into that assumption. At the same time, I can understand it, given the large number of heresies in this country, and by that I mean within the churches.