<rant>

I have received several comments and inquiries over the last few months regarding methodology. These questions stem from claims and criticisms that have been rattling around NT studies for several decades concerning the legitimacy of eclectic approaches compared to the pure application of a system-based approach like that of Systemic Functional Linguistics (SFL). The general view I’ve heard espoused has come from SFL practitioners, and the assertion is that eclectic approaches are not just dispreferred, but border on illegitimate. While this may be true for those seeking to build a complete description of language within a single system, most applied linguists in linguistics proper are not afraid to adopt proven principles from another, theoretically compatible approach. They thus utilize a variety of approaches in order to best tackle the problem at hand.

I’ll begin with a quote from the introduction of Dooley and Levinsohn (2001:iii):

First, we intend it to be practical, addressing issues commonly confronted by field linguists. Rather than attempting to apply a rigid theory or survey a variety of approaches, we provide a methodology that has been refined over years of use. Second, although we follow no rigid theory, we aim for more than a “grab-bag” of diverse methodologies by attempting to present the material within a coherent and productive framework. Specifically, we follow a functional and cognitive approach that seems to be a good approximation of how discourse is actually produced and understood.

The key phrase there is “a good approximation of how discourse is actually produced and understood.” This is the focus of applied linguists: describing how discourse works, not theorizing about how language as a whole operates.

Where do linguistic theories come from?

This complaint about eclecticism demonstrates a disregard for the historical development of linguistic theory, including SFL. No linguistic theory has developed ex nihilo, nor do theories continue to develop in hermetic isolation. Rather, we most often find an evolutionary process at work, with each method building on those that have gone before. Halliday, Lyons, and the early functionalists were reacting against the formalist movements of the day. Halliday, specifically, concentrated on mapping all the different choices that are made in the production of a discourse as a functional system. His basic assumptions about choice and meaning, and about cohesion and coherence, were sound, but not all of his ideas panned out. Halliday began his theory by incorporating insights from the Prague School’s Functional Sentence Perspective (FSP), such as the notions of theme/rheme and given/new. He also worked with Hasan to make significant claims about coherence and cohesion. But not everything went right.

Halliday’s reformulation of the Prague School notions of theme and rheme into his systemic theory led to problems. His claim that “what comes first” is theme doesn’t work outside of English, in languages like Greek, Hebrew, or any other highly inflected language. Similarly, his assignment of given/new to the “tone group” works well enough for English, but breaks down quickly in languages that use some means other than intonation to mark such things. For instance, Japanese uses particles rather than intonation alone to differentiate P1 from P2. So while there is much to praise about Halliday’s overall assumptions and insights into coherence and cohesion, there was a need to better account for what he mucked up in information structure. This is partly why there has been a split (or coup) within the SFL ranks, which I note in section 9.4 of my Discourse Grammar.

Other responses to the Prague School

There were other theorists besides Halliday working the problems left by the Prague School. Paul Grice, for instance, applied logic to pragmatic choice, in contrast to Halliday’s socio-cultural focus. Grice developed a very concise set of pragmatic “rules of engagement,” summarized in his maxims of conversational implicature, that offered a better account of how such choices are made in language than Halliday could provide. Grice’s maxims and submaxims were then further distilled by Stephen C. Levinson into three: be brief, be relevant, be perspicuous. Neo-Gricean pragmatics could account for the kinds of socio-cultural influence that Halliday had on his to-do list but for which he offered no concrete mechanism. The pragmatics folks were not concerned with Halliday’s broader questions, but instead with solving the one piece about which they were most passionate. They carried on their specialized work on pragmatics, but those who followed behind took the insights from Gricean pragmatics, added them to Halliday’s insights about the systemic functional nature of language (choice implying meaning, etc.) and the Halliday and Hasan insights about cohesion and coherence, and carried on their merry way looking into other nagging issues. Recall that all of this was built on the correct insights from the Prague School.

Then along came Sperber and Wilson with their insight that Grice’s maxims could be distilled even further, not just to three principles but to just one: the principle of relevance. Relevance Theory (RT) sought to re-envision the entire language production and comprehension process based on this one principle. In doing so, Sperber and Wilson demonstrated that cognition is indeed central to language production, as Chomsky had theorized. But instead of some universal grammar hard-wired into our beings, they found that grammar and language were a natural outworking of our cognitive processing. Like Halliday, RT was mostly consumed with English, so others needed to adapt and redirect its basic insights in order to account for the typological data and patterns found in other languages.

Some focused on the cognitive aspects of language, including Wallace Chafe, Walter Kintsch, Ronald Langacker, and George Lakoff. Their work predates RT, but all were seeking to account for the same kinds of phenomena. Chafe and Kintsch wanted to understand what we actually do with discourse as we process it. Chafe found that we don’t store the words we read or heard, at least not after the first bit of time has passed. Instead, these words somehow become transformed into mental pictures or “representations” of what was processed. Kintsch, in his research into reading comprehension—what we really should be reading to develop a DA methodology for written texts in biblical studies!—found that we iterate between the construction of Chafe’s mental representation and its integration with what we already have in there, based on our knowledge of the world, previous experience, etc. Lakoff found that words don’t have meanings so much as meanings have words. When I say the word “dog,” what pops into your mind is likely slightly different from what pops into mine, yet there is an agreed-upon, prototypical range of meaning for “dog.” When this range is stretched, we mark it by calling something “dog-like” or the like. These researchers found that most meanings have fuzzy edges, not the black-and-white boundaries envisioned by Aristotle. This has led to tremendous development in the area of cognitive linguistics and human comprehension. Nevertheless, this work still builds on the bits that Halliday and others got right, while seeking to fix what they either got wrong or had no interest in accounting for.

Typologically informed approaches

Meanwhile, back at the ranch, there were others who sought to apply the Prague School insights—like Halliday—but outside of English. Simon Dik began looking at minority languages to determine whether theme and rheme were indeed universal across languages. He found they largely were, but Halliday’s account, based on English, was too idiosyncratic to scale out to non-Western languages. This led to the rise of Functional Grammar, later developed into Functional Discourse Grammar (FDG), a typological updating of the Prague School. It was Dik who theorized that there were two functional reasons for marked word order, represented in the preverbal slots P2 and P1 (what I have called emphasis and frames of reference, respectively). These linguists also worked through other issues overlapping with Halliday, but somewhat in isolation from what was going on elsewhere regarding cognitive approaches.

Chafe too was interested in information structure, specifically its intersection with cognitive processing. This line of research fed into the development of Construction Grammar, which describes set linguistic patterns and functions as “constructions.” Chafe’s greatest disciple, Knud Lambrecht, pulled the pieces together from the Prague School, cognitive linguistics and mental representations, Gricean pragmatics, and the given/new distinction into a coherent account of information structure, one that satisfyingly traces the shaping and production of utterances all the way through to their cognitive processing and storage by the hearer/reader. Lambrecht’s model has subsequently been adopted almost wholesale in RT, Role and Reference Grammar, and elsewhere. Kintsch’s work in reading comprehension offered independent corroboration of many of Lambrecht’s ideas.

Objectives shape the theory

The differences we find among linguistic methodologies do not stem from one being right and another wrong; they stem from the differing questions that each theory is attempting to answer. SFL focuses, among other things, on the socio-cultural factors that shape language use, but not on cognitive processing. As the need arises to develop this other area, SFL practitioners can either reinvent the wheel or borrow from another theory that has already done the hard work of development. We see this happening if we track the other functionalists who came along shortly after Halliday.

Folks like Foley and Van Valin asked a brand-new question, partly in response to Halliday’s English-centric problems: What would a typology of language look like if it had not begun with a Western European language like English or German? Role and Reference Grammar (RRG) thus works toward a unified framework that can account for the kinds of things that Halliday first postulated, but without the predisposition to English. RT seeks to do the same thing, but in terms of the basic presupposition of relevance. Construction Grammar (ConstG) seeks to do the same thing. How? By taking the pieces that work from the other approaches and filling in the gaps or errors in light of each approach’s specific goals and objectives. They all begin with the Prague School insights that needed improvement, then build on Halliday’s understanding of cohesion and coherence, then on the work of Chafe and Lambrecht on information structure and mental representations, which in turn builds on Gricean pragmatics and the basic assumption of relevance from RT. All have common roots and common presuppositions, but unique objectives. To the extent that their basic functional presuppositions agree, each can utilize the work of the others to improve the whole while accomplishing its unique objectives.

Consequences of purist theoretical approaches

So what is the problem with eschewing eclecticism to maintain a pure theoretical framework like SFL? Nothing, really. It is a valid choice, but it is not without drawbacks. To begin with, Halliday’s school of SFL has done little to incorporate the critical insights from cognitive linguistics in terms of mental representations, relevance, and information structure. Some have maintained their purity to their own detriment as they refuse to incorporate new insights from other approaches. This isolation has also led to a revolt of sorts: the Cardiff school of SFL has broken with Halliday’s Sydney school on certain issues, based upon the latter’s resistance to updating its ideas in light of these newer insights.

The purity of SFL within NT studies

Let us return more specifically to Porter’s application of SFL to NT studies. His claim to maintain theoretical purity stands at odds with his incorporation of concepts from outside SFL. I have found no evidence of contrastive substitution being adopted and reworked within SFL (critiqued here), nor of Porter’s symmetrical approach to markedness theory (critique forthcoming). Both have been borrowed, which sounds like the kind of eclecticism about which he complains. I applaud him for using things that work; every applied linguist worth his salt does the same. My problem lies in Porter claiming to operate purely within SFL when in fact foundational pieces of his theoretical framework are borrowed. The same could be said of his incorporation of frequency analysis; I have found no such approach evidenced within Halliday’s SFL. One should not play both sides of the fence.

Advantages of eclecticism

This has been a lousy attempt at narrating the history of functional linguistics, but my point is simple. If you trace any modern theory of language (SFL, RT, RRG, FDG, CogG, ConstG) back to its origins, you will inevitably find evolutionary eclecticism along the way. Everyone has built on the bits that others have gotten right. The exceptions are the purist theorists, whose approaches necessarily demand that they reinvent the wheel in order to assimilate some insight X into their worldview of language. Most everyone else is eclectic to one extent or another.

Adopting a purist approach necessitates not incorporating proven insights from another theory until they have been assimilated into one’s own. Levinsohn and I are too practically motivated to do this. We are applied linguists, not theoretical ones, except where the need arises. If you look at the table of contents of Dooley and Levinsohn’s Analyzing Discourse, you will find this illustrated. They begin with the basic insight from Leech that the number of speakers involved in the production of a discourse impacts the kind of discourse that results. This is followed by Longacre’s insights from his Grammar of Discourse into how agent orientation and sequentiality meaningfully explain how different genres come about, and why features in one genre might operate differently in another. They then move to Halliday’s insights from SFL into idiolect, style, and register, adding another layer of complexity to our understanding of language. Then they shift from these complexities to the specific factors that hold a discourse together. D&L’s discussion of coherence and cohesion begins with Halliday & Hasan (H&H) because they nailed the basic concepts within SFL, but H&H didn’t provide the practical tools for working out these ideas at the sentence level. So D&L shift to other approaches that do address these practical questions. Here insights about cognitive processing from Chafe (CogG) on chunking, and from Lambrecht (ConstG) on information structure, enter the picture, along with Levinsohn’s MA work on participant reference in Inga.

Levinsohn and I are indeed thoroughly eclectic, but the important point to recognize is that, given the evolutionary development of linguistic theory, most everyone else doing applied work is as well. We would still have been eclectic even if we had used only Construction Grammar or RT, since each theory has built upon the others.

Disadvantages of purist theoretical approaches

Being a true methodological purist has its drawbacks for the end user. It would have required that I not use proven insights from another approach or field, such as cognitive poetics, which is rethinking poetics and literary analysis in light of insights into the cognitive processing of language. I would only be able to use such insights after they had been assimilated into my own theory.

More importantly for my task at hand, being a non-eclectic purist would have necessitated that my Discourse Grammar readers learn my method’s jargon and idiosyncrasies rather than the accessible description that eclecticism has made possible in the published version. If you wonder why it is easier to understand than some other works in our guild, my eclecticism is partly to blame. The purist approach typically complicates things far more than eclecticism does, owing to the methodological constraints of the theory. Theoretical approaches are generally less applicable than applied ones, since the theory, rather than the data, constrains the approach.

Goals drive our method

The real question to ask is this: What exactly is our goal for linguistic analysis in biblical studies? Is it to develop a unified theory of language, or to functionally describe the features and their contribution to the discourse? Both are legitimate, but I fear the two have been conflated as though the latter necessarily presupposes the former. There is nothing wrong with Porter developing a theory of language to account for the NT corpus, but there are plenty of existing theories besides SFL that may readily and legitimately be applied.

A purist application of Halliday would not necessarily lead to understanding the structure and flow of Romans. Instead it would mean developing a theory of language that could account for the linguistic artifact we find in the NT book of Romans. Brown and Yule would also likely have a different objective for DA than most NT scholars would desire, since their focus is directed to spoken rather than written discourse. In my view, Walter Kintsch’s work on reading comprehension is really where we should be focusing: it is thoroughly up to date in terms of cognitive processing, and it is specifically focused on the comprehension of written discourse. Spoiler alert: this is the direction I will be heading in 2015 in my volume on moving from discourse grammar to discourse analysis.

One final point: if I had wanted to be a purist, I could have reformulated almost all of my descriptions of discourse features under the auspices of RT, ConstG, CogG, RRG, or FDG, so long as I adapted to their idiosyncrasies. All are in basic agreement in terms of fundamental presuppositions, which is why I could operate within any one of them. However, their differing objectives, like RRG’s imagining of a non-Western typology, necessarily lead to differences in the approaches. So eclecticism is not as reckless or strange as Porter has portrayed it, nor is methodological purity the only legitimate linguistic option. The reality is that there is no perfect framework; each is focused on a different part of the same overall puzzle.

</rant>

<edit>

I trust that Dr. Porter is well aware of the principles and history that I describe above. I believe that he has framed the eclectic-versus-pure-theory issue in this way for rhetorical advantage. Since 1989, he has been writing for an audience within our guild that is largely ignorant of the broader field of linguistics. People have taken his claims at face value, assuming they accurately reflect the broader discipline outside NT studies. Those of us who have read more widely in linguistics proper recognize the advantage he has gained in framing things this way. The point of this post is to offer an alternative view on the matter.

If Porter wants to invest his energy in theorizing about the language of the GNT, there is nothing wrong with this. My calling is to provide something that has a more practical payoff for the guild. Denigrating eclectic linguistic approaches as illegitimate demonstrates an ignorance of how modern linguistic theory, especially functional theory, has developed. Keep that in mind as you hear this argument put forward in San Diego next month.

</edit>