HOLY CRAP! It just really startles me to find something on there that's not totally batshit nuts and/or mean and angry.
Did one of you guys post this? Fess up!
http://www.freerepublic.com/focus/f-bloggers/2347877/posts

What would it mean if the United States were officially declared a “Christian Nation”? How would it affect you in your everyday life? Would you have increased opportunity to practice your faith more freely? Would the government use its power to make moral laws that line up with your Christian beliefs, or would it favor the ‘Christian beliefs’ of your neighbors?
Our best example might come from a time when much of Europe was a “Christian Continent.” The Holy Roman Empire lasted from Emperor Otto’s coronation in 962 until 1806, when it was dissolved during the Napoleonic Wars. For all intents and purposes, it was considered the ultimate “Christian” political system.
The Empire was afraid of what would happen if people began to compare the activities of its political and religious leaders with the Bible. There was tremendous power in the idea that a political leader could advance policies not through debate, but by virtue of the claim that “God wants it this way, and if you disagree you are in opposition to God.” To put this in perspective, imagine that President Obama could win the healthcare debate by simply saying, “God wants it this way, and if you disagree you are in opposition to God.”