Hey guys! I hope you’ve had an awesome day so far! This will probably be my only post this week since I will be having rehearsal all week for my dance recital which is on Saturday. I’m a little nervous because some last minute changes were made to the dance so I don’t want to forget them! So, I would really appreciate it if you would keep me in your prayers!
Some people believe America was founded as a Christian nation, while others believe it was not. So, was it? I hope the following will help sort through the arguments on both sides.
Those who believe America was indeed founded as a Christian nation point out that the Declaration of Independence includes several references to God. The Constitution does not mention God, but it does honor the Christian Sabbath: it gives the President ten days, "Sundays excepted," to sign a bill into law.
The Pilgrims, Christians seeking religious freedom, opened the Mayflower Compact shortly after their arrival in America with the words, "In the name of God, Amen."
According to Joseph Story, who served on the Supreme Court in the early 1800s, the First Amendment was written with Christianity specifically in mind.
However, others claim that our country does not have a Christian founding. They may tell you that many of our Founding Fathers were deists, atheists, or agnostics, and that indirect references to God, such as "Creator," actually prove their deism, since many deists avoided referring to God directly. The Westminster Confession, written by Protestants, repeatedly refers to God as the "light of nature." And the First Amendment states that "Congress shall make no law respecting an establishment of religion," without mentioning Christianity at all.
We need to realize that a country is not defined as Christian simply by how many of its citizens are Christian. Still, all but two of the early universities founded in America began as Christian institutions. What gives a nation a Christian founding is being formed by Biblical principles, with values and a society shaped by those principles. That describes America. Therefore, I believe we speak the truth when we say America was founded upon Christian principles.
What are your thoughts on this post? Let me know by leaving me a comment!