sidney
But didn't Christ tell us that He sends us out like sheep among wolves? Did He not say that because the world hated Him, it will hate us too? The Bible absolutely states that as Christians we should be righteous and stand up for justice in every sector of society. But is it really realistic (or even Biblical) to expect our whole culture to become Christian-minded, especially since the Bible indicates things will get worse?
I believe Christ told us to spread the gospel to the nations, and I don't think we can do that without opening our mouths and sharing the truth in love. The gospel itself is offensive and the flesh doesn't want to hear it, but that is the commission we have been given. People think it's wrong and judgmental to correct an unbeliever or a brother or sister in Christ, but I think it takes more love to tell the person in hopes that they receive it, even though they may not like you for it. I'd rather someone tell me I'm on my way to destruction than pat me on the back and tell me I'm doing okay, or worse, stay silent while I die in sin.