Seven new Greek manuscripts of the Bible have been found over the past several months. According to Professor Dan Wallace of the New Testament department at Dallas Theological Seminary, the seven manuscripts are a great find because one of them, a portion of the Gospel of Mark, dates to the first century.
A second, a portion of the Gospel of Luke, dates to the early second century, as does a piece of Matthew's Gospel. The find also includes two manuscripts from the book of Hebrews, one from 1 Corinthians, and another from the book of Romans.
What this means is that we have more evidence that the Bible is reliable and that the books were not written some 400 to 500 years after the fact, as the graduates of the DaVinci School for Ignorance typically claim.
We already had one manuscript from the Gospel of John that dated to the second century and one from Paul that dated to the third century. These seven manuscripts predate all of them and show us that we can trust the Bible that we have been given. It is reliable.
Watch the video as Dan Wallace explains that even these new manuscripts will not give us any new information, but instead confirm that the manuscripts we already had were, and are, reliable.
You can also refer to an article entitled Can We Trust the Bible? by Arthur Khachatryan. Here is a bit of what he writes:
So how sure are we that we can identify what the originals said? How certain can we be of their consistency? Some have made a cottage industry out of embellishing some of these inconsistencies by claiming that there are upwards of about 300,000 individual variations of the text of the NT. However, most of the differences, such as spelling errors, grammatical mistakes and inverted phrases, are inconsequential. A full comparison shows 98% agreement, and of the remaining differences, virtually all yield to vigorous textual criticism. This means that the NT of today is 99.5% textually pure. In the entire text of roughly 30,000 verses, only 50 are in doubt and none affect any significant doctrine.
An often-cited apparent inconsistency is that there are copies that have errors and deviations from other copies, which make it difficult to trust the text altogether. However, when we take a deeper look at the deviations, we can see that these differences between the copies are expected and do not reduce the trustworthiness of the texts. To expect writings of this length to be copied without any errors is unrealistic. In fact, the text would be more subject to scrutiny if the copies matched too perfectly, as we could charge it with collusion. We need to always account for human error, no matter the topic. Spelling and grammatical errors should be expected. We also see differences in sentence structure, intended to more accurately relay the message. But the substance doesn’t change. What is also significant about the number of copies is that it bodes very well for determining the exact content of the original writings. Ultimately, whatever errors and inconsistencies exist across copies do not matter that much, because we can clearly understand what was contained in the original writings.