Hello, I am a Christian and have been for about seven years now. I grew up in the church but fell away at a young age because of family issues. When I got older I got into drugs, and I ended up going through a long-term treatment/discipleship program.

I have been home for more than a year now, and I constantly feel I am not where I am supposed to be. I keep feeling that God wants me to quit my job and just trust Him, trust Him for money and everything else, and spend my time teaching His word to people. It's so confusing, because everyone around me, even people at my church, says God wouldn't tell you to do that, and that God wants us to work. I am not lazy, and I have had the same job for about ten months now, but the whole time I haven't felt like I'm supposed to be there. I mean, how can it not be God if the feeling doesn't go away? That's usually how God speaks to me through His Spirit: if I don't do what He says, He just keeps putting it on my heart.

Any thoughts? Also, before I got the job I felt like God didn't want me to have one, but because I live with my sister I felt pressured to get one.