Sunday, January 20, 2008
So, in an attempt to be true to my vision for this blog, I am going to propose something that should be fun and interesting for all who stumble onto this site.
Starting next week, I will review one chapter per week of a book that I think deserves our honest attention. Yes, some will indeed be from the perspective of those who think the world points to nothing whatsoever, but to start I will embark on a review/dialogue over Behe’s latest book, The Edge of Evolution.
So if you have not purchased it yet, now may be a good time to do so! Of course, any other interesting news that pops up as it relates to arguments over design will also grace the face of this place.
Regards,
TST
Wednesday, January 9, 2008
Has the New Testament Text Been Corrupted?
Welcome to part 2 of our discussion on the reliability of the NT (if you missed last week’s blog, check it out). To begin with, you need to know that none of the original manuscripts of either the Old or New Testaments exist—all that remain are imperfect copies. But this is no different from any other ancient document or classical writer (e.g., Greek or Latin literature). We rely on copies in our daily lives all of the time (e.g., clocks and yardsticks). In fact, as we learned from last week’s blog, we have far more manuscript evidence to work with than for any other ancient document.
Now the million-dollar question: do we have absolute, 100%, “bomb-proof,” mathematical certainty about the correct reading of the original text of the NT? The answer to this question is no, we do not. Does that mean we are thrown into utter skepticism regarding the reading of the original New Testament text? Not at all. Here is why. Ehrman in his book Misquoting Jesus claims that there are 400,000 textual errors / textual variants in the NT (i.e., differences between the texts). This claim sounds daunting and scary considering there are only about 140,000 words in the Greek NT. What should we say to this? When textual critic Dan Wallace teaches on this topic in popular settings, he makes the initial statement that “99% of the textual variants make no difference at all” (spelling and nonsense errors comprise the vast majority of these; BTW, when I was in graduate school we did some textual criticism exercises, and it was remarkable how easy it was—even for novices—to pick out most of these issues). This means that fewer than 4,000 places (roughly 1% of the 400,000 variants) have any legitimate bearing on the translation of the text. So what kinds of errors are these? Wallace categorizes them into 4 kinds; I will mention these and leave you some homework to flesh out the particulars.
Watch Dr. Wallace discuss these issues: http://jesusfactorfiction.com/answer.php?new_testament
1. Spelling differences (spelling accounts for the great majority of these variants, roughly 70-80%)
2. Minor differences that involve synonyms or do not affect translation (Greek is an inflected language, so you can say the same thing with several different constructions)
3. Meaningful but not viable differences (e.g., 1 Thess. 2:9 – a late medieval manuscript says “gospel of Christ” instead of “gospel of God,” meaningful difference but not viable because there is little chance one scribe got it right much later and all other scribes got it wrong).
4. Meaningful and viable differences (less than 1% of variants). “Meaningful” here is not “earth-shattering” like Jesus was a liar or something…almost all are minor variations (cf. Rom. 5:1, “let us have peace with God” or “we have peace with God,” and Mark 9:29, casting out demons by “prayer and fasting” or just “prayer”?)
It is important to remember that the reason we have so many variants is that we have so many manuscripts (again, a good problem to have!). So when all is said and done, scholars have achieved a 99.5% copying accuracy of the New Testament (and the remaining issues all have finite options, and none of them affects any central doctrine or issue). You can have confidence that what was written then is what we have now—whether you accept the message contained therein is up to you.
Labels:
apologetics,
biblical studies,
textual criticism
Tuesday, January 1, 2008
C.S. Lewis Society Blog and Misquoting Jesus
Welcome to the C.S. Lewis Society Blog. As a team we will be posting weekly blogs that will address or highlight key issues in Christian apologetics and Intelligent Design. Feel free to comment on these posts. Wherever you are on your spiritual journey, we encourage you to engage these issues / topics openly and honestly.
It has become fashionable of late to make provocative claims concerning the origins of Christianity. Take the Da Vinci Code and the so-called “missing gospels” (e.g., the Gospel of Judas) as exhibits A and B. But recently, a book questioning the reliability of the New Testament has become a best seller: Bart Ehrman’s Misquoting Jesus: The Story Behind Who Changed the Bible and Why. What makes this interesting is that it is basically a book on textual criticism written for a popular audience. Now Ehrman, who is chair of Religious Studies at UNC, is a well-respected textual critic, and so many are being influenced by his book and his controversial claims. Here is just one of them:
“The more I studied the manuscript tradition of the New Testament, the more I realized just how radically the text had been altered over the years at the hands of scribes….It would be wrong…to say—as people sometimes do—that changes in our text have no real bearing on what the texts mean or on the theological conclusions that one draws from them” (Misquoting Jesus, 207).
Now this raises some good questions, because few things are as central to Christianity as whether or not the Bible—as we have it today—has been reliably copied. Ehrman’s claims have not gone unanswered. Fellow textual critic Dan Wallace, whose Greek grammar textbook is used at two-thirds of the schools that teach intermediate Greek (including Yale, Princeton, and Cambridge) and who is professor of New Testament at Dallas Theological Seminary, has responded at the popular level to Ehrman’s Misquoting Jesus. (FYI, these two scholars will be debating one another on the textual reliability of the NT on April 4-5 in New Orleans at the Greer-Heard Point-Counterpoint Forum.)
It really comes down to two issues: (1) do we have an adequate number of manuscripts to work with in order to recover the original writings, and (2) is what was written then what we have now? We will briefly speak to (1) today and address (2) next time. Concerning (1), Wallace notes:
“The wealth of material that is available for determining the wording of the original New Testament is staggering: more than fifty-seven hundred Greek New Testament manuscripts, as many as twenty thousand versions, and more than one million quotations by patristic writers. In comparison with the average ancient Greek author, the New Testament copies are well over a thousand times more plentiful. If the average-sized manuscript were two and one-half inches thick, all the copies of the works of an average Greek author would stack up four feet high, while the copies of the New Testament would stack up to over a mile high! This is indeed an embarrassment of riches” (Reinventing Jesus: What the Da Vinci Code And Other Novel Speculations Don’t Tell You, 82).
Furthermore, Wallace observes that, “We have ample data to work with, enabling us to reconstruct the wording of the original New Testament in virtually every place. And where there are doubts, there is still manuscript testimony” (Dethroning Jesus: Exposing Popular Culture’s Quest to Unseat the Biblical Christ, 49).
So concerning (1) and contrary to what one might be led to believe reading Misquoting Jesus, Wallace represents what is the majority opinion among NT textual critics—there is plenty to work with and a significant number of these manuscripts are early. But whether these texts have been corrupted over time is what we will look at next week.
Now, I have simply included the conclusions. I will leave it to you to examine the evidence for yourself.
Here are some places to start:
- Dethroning Jesus: Exposing Popular Culture’s Quest to Unseat the Biblical Christ by Darrell L. Bock and Daniel B. Wallace
- Reinventing Jesus: What the Da Vinci Code And Other Novel Speculations Don’t Tell You by Komoszewski, Sawyer, and Wallace.
- Misquoting Truth: A Guide to the Fallacies of Bart Ehrman’s Misquoting Jesus by Timothy Paul Jones
- The Case for the Real Jesus by Lee Strobel
- Center for the Study of New Testament Manuscripts
- Daniel Wallace’s Blog on Textual Issues