This is the reason I despise religion. Almost all religious texts say to be kind to your fellow human, but ever since religion was created, it has most likely caused more hardship than help. Think about the Crusades, the Holocaust, the witch trials, and countless other acts committed in the name of a God. It is ridiculous, especially when I see that people nowadays are excluded and shunned just for not having faith in a particular sect. Even kids at my school are practically taught not to hang out with kids who are not Christian. So I am wondering: has the true meaning of religion been lost? It shouldn't matter what you believe in, so long as you believe in your fellow mankind, that we are all here to help and benefit each other. And if there is a God, it is a giant douche. If it is there, it needs to start doing its job. But anyway, back on track: why do you think religion has to be split into who is better?