A recent sermon in a local church emphasized how America was founded on biblical principles, and the need to put God back in the White House, back in schools, etc.
I couldn't disagree more.
Many of the problems we have in this world are rooted in religion. Religion does nothing good for society except corrupt it, programming people to favor select segments of society while neglecting the majority.
A good example of a theocratic state gone bad is Islamic rule. Pastors insisting "America needs God" condone the mistreatment of immensely oppressed people unfortunate enough to be born under Islam, trapped against their will within the suffocating walls of Allah's rule, and brutally punished if they simply change their minds about which god is the "true" god, or believe in none at all.
How would life for many in America be any different from life under Islamic rule if we forced on them the irrational, barbaric, self-serving and discriminatory principles of an invisible, imaginary "authority" completely fabricated by man and enforced by government?
Pastors who want religion forced into every aspect of the lives of everyone everywhere need to understand that America isn't their giant church.
People have the right to turn to religion if they choose, but also not to have religion forced upon them and their highly impressionable children (who still believe in Santa Claus) as captive audiences.
America's moral principles don't come from religion. The Bible's irrational obsession with sexual "purity," coupled with its complete failure to name the "sin" of kidnapping children for sex slavery, proves this.
Our morals come from accountability for our actions toward one another, not from a dictator who didn't get it right twice.
Religion doesn't own America. Americans of all faiths, of no faith and of all backgrounds own America.