Is the United States of America a Christian nation? I have no doubt that Israel can be considered a Jewish nation, that nations such as Iraq, Iran, Pakistan, and Afghanistan are Muslim nations, and that India is a Hindu nation. I am aware that some, if not all, of these nations have smaller percentages of other religions, but each appears to have one dominant religion. This brings me back to the United States. Is the United States a Christian nation?
Christianity is a religion that is based upon the life and teachings of Jesus Christ. Christians are members of different churches, which have different beliefs about Jesus Christ and His teachings, but all the churches consider Jesus to be central to their religion.
Most Christians believe that Jesus Christ is the Son of God and was sent to earth by God to be the Savior of the world. Christianity teaches that humanity can achieve salvation through Jesus Christ.
Jesus lived in Judea under the rule of the Romans and was crucified by the Romans about A.D. 30. Jesus' followers believe that He rose from the dead on the third day after His crucifixion and that He lives as a resurrected being.
After the death and resurrection of Christ, Christianity spread to major cities throughout the Roman Empire. It became the major religion in Europe and was brought to the Western Hemisphere by the Puritans, Pilgrims, and other early American settlers.
"Christianity has had an enormous influence on Western civilization, especially in the areas of art, literature, and philosophy. The teachings of Christianity have had a lasting effect on the conduct of business, government, and social relations" (Henry Warner Bowden,
World Book Encyclopedia, Vol. 3, 524).
Christianity was once the major religion in Europe, but it is no longer considered as such. We know that the United States began as a Christian nation and was built upon Christian principles by people who believed in God. The question remains: is the United States still a Christian nation?