As a young Christian in America, I cannot help but notice that many Christians, especially older ones, play the victim.
Why is it that many Christians today claim that we are the only oppressed group in the world? And I don't even mean compared to race, sexuality, ethnicity, whatever. I mean just among religious groups.
I was on the phone with my grandma, telling her about my fraternity brother being Muslim and about my World Religions class, and somehow the topic of living in the Middle East compared to America came up. She said that there's no religious oppression "unless you're a Christian." I was dumbfounded that someone in my own family had just made such an ignorant statement.
In America, everyone is free to practice their own faith. I HATE when people say that university professors attack Christianity and dismiss it as a myth. First of all, most people saying that are older and have no idea what university is like nowadays. Second, I'm currently in three religion classes. They treat ALL religions exactly the same: as mythology. They have to, because I have a Christian professor, a Buddhist professor, and an atheist professor, and all three try to remain as unbiased as possible when talking about religion. The only way you can tell a difference is that when certain topics come up, a professor will get more excited and enthusiastic about teaching them. Before saying that Christianity is the only religion that is "attacked" (I use that term VERY loosely) in universities, think again.
Also, before saying that we are the most oppressed, take a look at Muslims in America and how badly they are attacked. As I said before, one of my brothers is Muslim, and I almost fought someone because they called him a VERY derogatory term for someone of his faith. Luckily, Jesus held me back from striking this bigot. But my point is, I do not understand why some of us (not most) almost go LOOKING for ways to be oppressed.
Sorry this took so long for something so pointless... But my question is, why does this happen? Are some Christians just masochists who LOVE to be verbally attacked?