YES. You read it right. I used to have that faith in God. You know, that kind of faith which gives you a feeling of hope and love in everything that surrounds you. That faith which gives you a feeling of assurance that whatever you do and wherever you go, 'God' will protect you. That faith that tells you that whoever you meet, no matter what color of skin they have, you are all children of God.
After having lived in a rich country, my eyes saw not only the beauty of that blessed land. I also saw the ugliness of where I come from. And from then on I started asking questions such as, "Why is there such a huge difference between us and them?" "Why do people live peacefully here when back there many are suffering and in trouble?"
The questions are endless.
And it became worse when being around these people from the wealthy lands made me feel different and small... because they discriminate.
Then the old tune rings: "If God exists, why does he let this happen?"
Now I know my faith is not the same as before. As long as these questions are unanswered, I know I can't call myself a real Christian.
But I still want to have that same faith again. Because I know that without it I cannot have love and hope in me.
I hope real Christians out there can help me revive my faith! Thanks!