Legendary_Dollci wrote...
inuyashaboy_92 wrote...
I've been thinking a lot about what happens after death and... it kind of scares the hell out of me. What if it's over after death? I mean, I like the idea of reincarnation and/or heaven, but what if when you die you just slip into darkness forever? What if when you die you just stop, and you rot in the ground? Tell me what you think. I don't care if you're religious or an atheist, just say what you think, because I really want to know.
To me, death is the key to the world of eternity. It just means letting yourself go from this world.
Based on my religion:
When one dies, he stays in his grave, and his grave becomes a place shaped by his deeds until judgement.
I don't think anything ever really dies. I believe death is just letting your body go, since in my belief we are the soul and our body is the shell.
I wish it were like this:
When we die, we get to go to a place of true freedom, a beautiful place where everything is nothing like earth, somewhere really unique.
We are full of energy stored in our organic components and properties, so when we die, we fertilise the soil and feed other species, which turns into shit. And shit fertilises grass, and grass grows: chemical and physical change in a never-ending cycle, motherfucker.