Defined in the Oxford dictionary as "the belief in and worship of a superhuman controlling power, especially a personal God or gods", religion is thought-provoking and controversial; it inspires strong emotions and, at its worst, moves people to commit terrible acts of violence and war in its name.
Faith is defined as "complete trust or confidence in someone or something" or "strong belief in the doctrines of a religion, based on spiritual conviction rather than proof".
For me, faith is distinctly different from religion. Religion furnishes religious institutions with the power to control by fear. Faith, on the other hand, allows for pure belief in an entity without the need for fear or worship. Faith allows freedom of thought, will and speech where religion does not.
Organised religion has a part to play in society: it provides a social network and a safe haven in times of personal trouble or hardship. In times of conflict, more people appear to turn to religion for reassurance, so it becomes self-feeding, especially when you consider how many wars are fought in the name of religion.
Do we need religion to tell us right from wrong? Do we need religion to tell us how to think? Does religion have a place in politics? No to all of these.
Does religion hold us back and stop society from evolving? Does religion tell people how to think and judge? Are morals better today than previously? Yes to all!
The mainstream religions were founded thousands of years ago, and while I don't claim to be an expert on founding dates, my research suggests Hinduism arose some 6,000 years ago, Judaism 3,800-5,000 years ago, Buddhism around 2,500 years ago, Christianity around 2,000, and Islam around 1,400 years ago.
Can the teachings of ancient times still be relevant today, or do they hold us back and stop society from evolving? We are still fighting the same wars hundreds and even thousands of years on; where is the sense in that?
From what I can see, religion is about power and control, not just over those who embrace a particular doctrine, but also over the lifestyles and thinking of everyone who shares the planet.
I have so much more to discuss on this topic but will save it for another day. Have morals declined or improved over time? Has anybody else noticed how heavily male-dominated religion is? Is any religious institution female-dominated? Oppression...?