presonorek
Who cares?
It makes us a nation founded by Christians but not a Christian nation
Exactly. England is officially a Christian nation; the United States is, and always has been, intentionally a non-religious nation. There is a political motive for convincing people that the United States is a Christian nation. Culturally, however, the United States forced the world to shift in a Judeo-Christian direction during World War II. The old European ideology of celebrating the strong and discarding the weak was finally demolished for good. The Judeo-Christian values of the dignity of all human life and the equality of humankind now dominate the world.
I'm bad at explaining this concept, but the ancient values of European thought were finally eradicated in World War II. So we can say the world, including the United States, is now more Christianish than it was when the United States was founded. I suppose it depends on how loosely someone defines the word Christian.