Christians get a bad rap a lot of the time. Because Christianity has been a dominant force in Western society for two thousand years, people want to blame it for everything bad that happened. But if that were really the case, then Christianity would also deserve credit for everything good that happened.
The truth is that people made everything happen, good and bad. People are also the ones who made up Christianity in the first place, to help them make things happen. Neither God, Jesus, his mother, nor any of the hundreds of proclaimed saints had anything to do with anything, and if they did, it was because they were people acting the way people do, not because they were divinely inspired in any way.
I'm not saying that a lot of Christians, especially the leaders, aren't total assholes, because they are. But they are assholes first, and they become Christians when they realize that Christianity can help them be even bigger assholes than they already are.
So, what do Christianity really mean, Mr Natural?
Don't mean shit, Kid.