How common is "dominionism" among American Christians?
Dominionism is "a group of Christian political ideologies that seek to institute a nation governed by Christians based on their understandings of biblical law."
In essence, a theocracy.
How common is this view among American Christians? Is it the eventual goal for the United States in your church or among Christians in your area? Is it your goal, as a Christian?
As with any question I ask about religion, please refrain from posting anti-religious snark.