I fully believe that GOD WANTS us to go to doctors and get help if we need it.
I also see the benefit in medicine or therapy.
However, I question whether it is really doctors who heal, or God who made our human bodies with the ability to heal and recover.
Yes, doctors can operate and remove cancer or set bones.
However, they can't make your body heal.
Only God can. He causes our cells to replicate and replace the injured or removed parts of us.
It just occurred to me that some people are like, "What has God ever done for me? He didn't heal me, the doctor did. Or science."
That's not true. God made your body. He blessed you with doctors and the money to pay for those doctors.
God should be given people's praise, not scorn and mockery, because He didn't do a "miracle" of healing someone instantaneously.
Even death isn't the end because God promises us resurrection and eternal life with Him.
So if you wonder what God has given you today.....
close your eyes, take a deep breath...
and then thank Him for being alive and allowing you to breathe.