Quote:
Originally Posted by jsarno
Maybe so, but what is the harm in that?
Do you tell your kids that life sucks? Or do you tell them they are special and they can achieve anything they set their mind to, and guard them from violent images on TV etc?
Well first off, we're not discussing what we tell our kids, we're discussing what adults believe. But no, there is no harm in having faith in something if it makes you feel better (provided you're not organizing against other faiths).
I'm not arguing about whether religion makes you feel better. It certainly does. I'm just arguing for what is most likely the truth: there is no God. People just believe in him to feel better about their mortality and everything else they can't explain.
I do the same thing as GManc. When my grandmom died I tried to think of her as living on in heaven. Or at least living on in my heart and in my memories. Did I believe it was the truth? Hell no. But I told myself that, and it got me through a rough time.
And if religion can do that for you, even if it's bunk, then it's doing some real good.
Like I said, I don't think Jesus was the son of God, or even that God exists. Nevertheless, the stories and lessons in the Bible are a great guide for how to live your life. Whether you believe or not isn't the point. It's a great example, fact or fiction.