Thoughts on how an ordinary citizen can make a difference by strengthening faith in God, family, and country.
Declaration of Independence
We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness. - That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed.
Saturday, October 31, 2009
Is the United States of America a Christian nation? I have no doubts that Israel can be considered a Jewish nation, that nations such as Iraq, Iran, Pakistan, Afghanistan, etc. are Muslim nations, and that India is a Hindu nation. I am aware that some, if not all, of these nations have smaller percentages of other religions, but I seem to see a dominant religion in each of them. This brings me back to the United States. Is the United States a Christian nation?
Christianity is a religion that is based upon the life and teachings of Jesus Christ. Christians are members of different churches, which have different beliefs about Jesus Christ and His teachings, but all the churches consider Jesus to be central to their religion.
Most Christians believe that Jesus Christ is the Son of God and was sent to earth by God to be the Savior of the world. Christianity teaches that humanity can achieve salvation through Jesus Christ.
Jesus lived in Judea under the rule of the Romans and was crucified by the Romans about A.D. 30. Jesus' followers believe that He rose from the dead on the third day after His crucifixion and that He lives as a resurrected being.
After the death and resurrection of Christ, Christianity spread to major cities throughout the Roman Empire. It became the major religion in Europe and was brought to the Western Hemisphere by the Pilgrims, Puritans, and other early American settlers.
"Christianity has had an enormous influence on Western civilization, especially in the areas of art, literature, and philosophy. The teachings of Christianity have had a lasting effect on the conduct of business, government, and social relations" (Henry Warner Bowden, World Book Encyclopedia, Vol. 3, 524).
Christianity was once the dominant religion in Europe, but many would say it no longer holds that position. We know that the United States began as a Christian nation and was built upon Christian principles by people who believed in God. The question remains: is the United States still a Christian nation?
I am a grandmother who is concerned about the direction our country and world are headed and about what my grandchildren will inherit. I want to do my part to bring peace on earth and sanity to our insane world.
WE THE PEOPLE of the United States, in Order to form a more perfect Union, establish Justice, insure domestic Tranquility, provide for the common defence, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity, do ordain and establish this Constitution for the United States of America.