I know it sounds crazy, but as soon as Christians start telling non-Christians how to live their lives, we've lost the Christian faith.
Many Christians have so busied themselves with programs and activities that they no longer know how to be silent and meditate on God's word or recognize the mysteries that are in the Person of Christ.
What the world requires of Christians is that they should continue to be Christians.
It confuses me that Christian living is not simpler. The gospel, the very good news, is simple.
Likewise today, some Christians are content to merely exist until they die. They don't want to risk anything, to believe God, to grow or mature. They refuse to believe his Word, and have become hardened in their unbelief. Now they're living just to die.
I feel that if I live the Christian life, then people should be able to see it in my everyday actions.
Let us not seek to bring religion to others, but let us endeavor to live it ourselves.
Those of us who were brought up as Christians and have lost our faith have retained the sense of sin without the saving belief in redemption. This poisons our thought and so paralyses us in action.
We don't expect every operator to be Christian, but we tell them we do expect them to operate on Christian principles.
We need to understand that Christianity is about changing; it is not merely a religion.
Christians are supposed not merely to endure change, nor even to profit by it, but to cause it.