
Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Thu May 21, 2020 1:45 am
by Physics Guy
Symmachus wrote:
Wed May 20, 2020 2:59 pm
[T]echnical knowledge used to matter in a way that it doesn't anymore. So of course the kinds of questions are different; there aren't many texts left to be edited, for one thing, and hardly anyone has the skills to do the few that are left ... .

The kinds of questions Petrey lists and Midgley dislikes—"new topics such as body, sexuality, race and ethnicity, empire, and material culture, to name a few"—obviously reflect the social and political preoccupations of the people employed in the field. I wish Taylor Petrey would be more explicit about that, presuming he is capable of that degree of self-reflection, because that shift is really what they are arguing over. This shift is what Midgley probably means by his absurdist short-hand of "sex and gender," because to opponents any one of these topics is as tendentious as another, and instead of engaging with that shift Petrey has chosen to play a rhetorical game with an old dotard like Midgley who doesn't care enough to make fine distinctions: sex/gender/empire/race/ethnicity are to him all avatars of the same activist ____.

Now, it may be emergent senility or apologetic silliness that gives wing to Midgley's hyperbolic claim that sex/gender (and the rest of the list) is all that is done in the field, but Petrey is a bit disingenuous in listing them as just some among many approaches. Hardly. These kinds of preoccupations dominate the agenda for the field. Hardly any dissertation advisor would say: "hey, why not try a philological commentary, or an edition of a text?" Not gonna happen. Nor would anyone get tenure for doing an edition of a text or any kind of real philology or a book on the rise of the episcopate in Gaul without talking about sex/gender/empire/race/ethnicity/identity/etc. at considerable length, whether relevant or not. Look at the job announcements each year and you can see what the field is like now—hence the wisdom of the dissertation advisor in steering students away from doing anything other than interpretive work. It used to be that you had to prove your technical competence before you could be taken seriously as an interpreter; now all is an endless cycle of interpretive updating so as to address the contemporary social and political anxieties of upper-middle-class academics (since Trump's election, for example, there has been a sudden interest in rethinking the Visigoths etc. as "immigrants," which is just totally anachronistic and obviously motivated...there is huge interest in "eco-criticism" and climate in antiquity, with predictable arguments and tendentious claims...and on...and on...). Sometimes there are some genuine insights that result, but not nearly enough to justify all of this. At the very least, scholars of the past should preserve knowledge of that past, but preservation matters less than production.

Midgley is not totally wrong, even if he is clumsy.
I get it that Midgley isn't really misrepresenting Petrey by making him out to be all about sexuality, even though that is what Midgley says, because really Midgley is just tarring with a broad brush and using "sexuality" as a code word for a whole range of trendy topics and the trend they rode in on. Either Midgley doesn't notice the distinctions between gender studies and eco-criticism, or else pretending not to notice the differences is his way of sneering at their whole clan. But then perhaps we should give Petrey credit for pretending to take Midgley seriously. It could be a deadpan riposte.

Your description of Early Christian Studies makes it sound to me like a dying field. If I think of it as a dying sub-field within a larger field (ancient history? history in general?) then I don't think it's necessarily a bad reflection on the larger field if sub-fields within it go into decline. Academic fields can decline from too much success, leaving too little for new researchers to do. What's important, I think, is how the larger field handles the decline of a sub-field. There should be some effort to archive the dead field's successes so that the knowledge isn't lost, but the dead wood has to be cleared out with a firm hand to free up resources for other topics. Things should not be propped up when it's time to wind down.

In my own field I think that this has clearly happened to nuclear physics, which once was a byword for cutting-edge science but passed the point, a few decades ago now, where the only things left to do are impossible. So the journals have shrunk or folded, the funding has dried up, and few departments have anyone left who specializes in nuclear physics, though most still offer a course or two in it, or at least include a little bit of it in one of their courses. I expect that the next few decades will see particle physics go the same way.

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Thu May 21, 2020 4:32 am
by moksha
Physics Guy wrote:
Thu May 21, 2020 1:45 am
Your description of Early Christian Studies makes it sound to me like a dying field.
Who knows? Perhaps it will be resurrected someday.

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Fri May 22, 2020 5:12 pm
by Symmachus
Lemmie wrote: I do think you are giving Midgley way too much credit for making a cogent argument. What Petrey was opposed to seemed much more specific, and directly related to Midgley’s long-standing habit of making up personal slurs to lend support to an academic position:
You are clearly correct on the particulars, but where I diverge slightly, perhaps significantly, is that I don't believe this is really about the particulars. On the particulars, Midgley is mostly right about the Hillerbrand response to Nibley (though I don't remember Robert Grant being so effusive: he wasn't endorsing Nibley's position so much as justifying publishing it in the first place—that's how it came off to me, anyway). But Midgley and Petrey don't really disagree on that, do they? Petrey snaps at Midgley ostensibly because Midgley has misread him (in part), which is true but not complete. What Midgley doesn't like is that Petrey hasn't said Nibley is the greatest scholar of all time who proved the Church is true with his brilliant intellect and its Shakespeare quotation machine. The reason he divines for Petrey's lack of reverence for the Master is that Petrey is a willing captive of contemporary academic orthodoxy (that is what "sex/gender" stands for). That is true, and whatever else Petrey wrote is irrelevant for Midgley.

But why does Associate Professor of Religion Taylor Petrey care at all about some twilight-zone blogpost comments made by an octogenarian whose grasp on these subjects is as loose as a Parowan paramour after a little Relief Society? He is obviously out for something rather more than trying to correct the record. I detect a bizarre though not unfamiliar touchiness here in the religious studies crowd. This isn't like Jenkins coming to argue about evidence or something; this is a tenured knight of the Mormon Studies round table going out of his way to prick some senile wingless dragon over some stupid point of order no one gives a ____ about (basically, "I never defiled the memory of Saint Hugh! I revere Saint Hugh!"), rounded off by a passive aggressive reference to wounded honor ("I know you're not going to apologize...").

The whole exchange that follows, and the interactions with Kiwi, are no less disingenuous than an apologist interacting with a critic, or an insurance company fielding a complaint: refuse to address the critic's basic complaint and, in a condescending fashion, redirect them to an official policy document that supposedly resolves it ("I encourage you to read Liz Clark's article..." is a more polite version of "Hugh Nibley's work on the Book of Abraham already answered that" or "Our terms can be found on our website").
Gadianton wrote:I think that I do see your point, which means by extension that I also see Midgley's point. The issue I have is the apologists' glaring hypocrisy on this point.
I completely agree with that, but I find a related phenomenon happening all the time in how academics interact with the public or with critics: play the role of the descriptivist but then surreptitiously impose a prescriptivist agenda. See, for example, Petrey's SL Tribune piece from a few months ago in which he, writing in his role as a professor of religion, first makes the erroneous claim that biological sex is a social construct (acting in his descriptive mode: "I'm just describing X"). Hardly anyone on the planet agrees with that, however, not least the LDS Church, the stance of which Petrey then characterizes as "enforcement" rather than description ("I'm just describing, but those guys are activists"). But the implication is that that "enforcement" has no biological validity, and therefore the original notion—that biological sex is a social construct—in his rhetoric is treated as normative. In other words, beneath his rhetorical pose of a disinterested professor explaining how things are, he is actually prescribing how things should be. Linguists do this all the time. "Language changes all the time" is true as a descriptive statement, but when a linguist says it as a response to the question "is zi a real English pronoun?" or the like, this descriptive truth is used, by implication, as the basis for defending the use of such pronouns: to reject the use of the pronoun means you are denying simple linguistic reality, and thus the descriptive fact becomes a prescriptive argument, the descriptivist assumes the role of prescriptivist. There is nothing wrong with making the argument, but I am revolted by this kind of rhetoric when the argument is made under the banner of the neutral arbiter of what is and what is not. If you're a pirate, fly the Jolly Roger, you thieving ____. Don't wave your goddamn Union Jack in my face. This is rhetorical sleight of hand practiced by academics engaging with the non-expert public all the time, and the religious studies people are some of the worst offenders. I doubt many of them even realize they are doing it, but then they are the same ones who lecture us about "critical thinking" and the slipperiness of language and all that.
Physics Guy wrote:Your description of Early Christian Studies makes it sound to me like a dying field. If I think of it as a dying sub-field within a larger field (ancient history? history in general?) then I don't think it's necessarily a bad reflection on the larger field if sub-fields within it go into decline. Academic fields can decline from too much success, leaving too little for new researchers to do. What's important, I think, is how the larger field handles the decline of a sub-field. There should be some effort to archive the dead field's successes so that the knowledge isn't lost, but the dead wood has to be cleared out with a firm hand to free up resources for other topics. Things should not be propped up when it's time to wind down.
You know, it's really hard for me to tell what is dying anymore and what is just plain sick. The share of majors going to these fields declines every year; history in particular is in crisis. There are hardly any classes on early Christianity even in religious studies programs (with a few elite exceptions, of course). I taught one course in Early Christianity that had 7 students; every other lecture course in history that I taught ranged from 40 to 70. At the same time, there seems to be an endless stream of publishing. Half the time I can't tell if the subfield that styles itself "Late Antiquity" is actually a subset of religious studies (note how Petrey cites Peter Brown's Body and Society as transformative, although Brown is usually thought of as the godfather of the subfield of Late Antiquity). Some of the Late Antiquity people have grumbled about this. I suppose it depends on what metric you want to use. Book reviews, always rife with cliches, endlessly talk about the "explosion of interest in the last few decades" etc., and you hear the same thing in conference papers, but judged by student demand, actually no one gives a ____. I suspect the gulf has to do with the structure of the academic economy, since the people doing most of the publishing are quite insulated from market forces. Or they ignore them in hopes of one day being insulated from them.

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Fri May 22, 2020 6:18 pm
by Lemmie

But why does Associate Professor of Religion Taylor Petrey care at all about some twilight-zone blogpost comments made by an octogenarian whose grasp on these subjects is as loose as a Parowan paramour after a little Relief Society?
You do have a way with words, Symmachus!

He is obviously out for something rather more than trying to correct the record. I detect a bizarre though not unfamiliar touchiness here in the religious studies crowd. This isn't like Jenkins coming to argue about evidence or something; this is a tenured knight of the Mormon Studies round table going out of his way to prick some senile wingless dragon over some stupid point of order no one gives a ____ about (basically, "I never defiled the memory of Saint Hugh! I revere Saint Hugh!"), rounded off by a passive aggressive reference to wounded honor ("I know you're not going to apologize...")

The whole exchange that follows, and the interactions with Kiwi, are no less disingenuous than an apologist interacting with a critic, or an insurance company fielding a complaint: refuse to address the critic's basic complaint and, in a condescending fashion, redirect them to an official policy document that supposedly resolves it ("I encourage you to read Liz Clark's article..." is a more polite version of "Hugh Nibley's work on the Book of Abraham already answered that" or "Our terms can be found on our website").
I had assumed some Maxwell-related bad blood had sparked this originally, but I am beginning to see your point.

See, for example, Petrey's SL Tribune piece from a few months ago in which he, writing in his role as a professor of religion, first makes the erroneous claim that biological sex is a social construct (acting in his descriptive mode: "I'm just describing X"). Hardly anyone on the planet agrees with that, however, not least the LDS Church, the stance of which Petrey then characterizes as "enforcement" rather than description ("I'm just describing, but those guys are activists"). But the implication is that that "enforcement" has no biological validity, and therefore the original notion—that biological sex is a social construct—in his rhetoric is treated as normative. In other words, beneath his rhetorical pose of a disinterested professor explaining how things are, he is actually prescribing how things should be. Linguists do this all the time. "Language changes all the time" is true as a descriptive statement, but when a linguist says it as a response to the question "is zi a real English pronoun?" or the like, this descriptive truth is used, by implication, as the basis for defending the use of such pronouns: to reject the use of the pronoun means you are denying simple linguistic reality, and thus the descriptive fact becomes a prescriptive argument, the descriptivist assumes the role of prescriptivist. There is nothing wrong with making the argument, but I am revolted by this kind of rhetoric when the argument is made under the banner of the neutral arbiter of what is and what is not. If you're a pirate, fly the Jolly Roger, you thieving ____. Don't wave your goddamn Union Jack in my face. This is rhetorical sleight of hand practiced by academics engaging with the non-expert public all the time, and the religious studies people are some of the worst offenders. I doubt many of them even realize they are doing it, but then they are the same ones who lecture us about "critical thinking" and the slipperiness of language and all that.
Fascinating. Thank you for the peek into the inner workings. My field and its attendant research approach are a little more cut and dried, so this background into a different world is much appreciated.

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Sun May 24, 2020 6:58 am
by Physics Guy
Symmachus wrote:
Fri May 22, 2020 5:12 pm
"Language changes all the time" is true as a descriptive statement, but when a linguist says it as a response to the question "is zi a real English pronoun?" or the like, this descriptive truth is used, by implication, as the basis for defending the use of such pronouns: to reject the use of the pronoun means you are denying simple linguistic reality, and thus the descriptive fact becomes a prescriptive argument, the descriptivist assumes the role of prescriptivist.
I have a linguistics professor in my family and all the linguists I know roll their eyes at all forms of grammar fascism. "Zi" could be a real English pronoun. Or not. It's up to us, and there is no other right answer beyond that. That is the accurate description of human language that linguists have found from thorough observation.

If there are otherwise some good reasons for introducing "zi", and if the idea that there is an eternally closed class of valid English pronouns is the only reason against using "zi", then sure, linguists are kicking the legs out from under the opposition to "zi". I have a hard time seeing this as linguists being prescriptive, though, or in any way dishonest. Suppose I call you in as a scholar of ancient languages to support my contention that kids should all have to read the King James Bible in school because its exact letter-for-letter text was dictated in English by Moses four thousand years ago and so it's the oldest book in the world. When you inform me that the King James Bible was not dictated by Moses, my position is weakened, but I don't believe I can blame you for being prescriptive.
You know, it's really hard for me to tell what is dying anymore and what is just plain sick. The share of majors going to these fields declines every year; history in particular is in crisis.
In the old, old days doctors and lawyers and clergy were being trained professionally, but for everyone else higher education was pretty much like training to play polo. There was a lot to learn, all right, but it didn't enable you to earn a living.

At some point it started to be argued that spending three or four years learning about old novels (say) actually was a useful professional training after all, because it was cross-training. The actual subject of study might have been a bunch of old novels but in writing term papers about them the student learned to read and think and write, and after graduating they would then be able to turn those finely honed reading and writing and thinking skills onto business administration.

As a defense of higher education in impractical subjects that has a terrible weakness. If the skills of reading and writing and thinking have so many applications in business, and if all those novels were merely placeholder dummies used in training those skills, then why not swap out the novels for business plans and let students have the best of both worlds?

Subjects like mine face the same challenge. The quantum mechanics industry is, shall we say, underdeveloped. We also argue for cross-training. Where I think we argue better than I have heard humanities professors argue is in saying why our placeholder dummy subjects are actually particularly good ones for training and so do add some value. Current industry practice is constantly changing but what will be possible tomorrow will still mostly be set by natural laws known today, so we can claim that our pure physics graduates are future-proof in a way that engineers aren't.

There has to be something like that to say why the particular subject of study is more than merely incidental to the externally valuable training that is being delivered. Most of what I've heard from humanities advocates is only that engineering and commerce grads write and think badly. That may be true but it's an argument for training professionals better, not for supporting higher education in the humanities. It's very possible that better cases for the humanities are frequently made and I just haven't heard them—my career has been almost all at science-heavy institutions.

In the long run I think that any academic subject needs to focus on some kind of practical value, not just to pay the bills, but to keep itself healthy. I don't think that internal peer pressure is a strong enough force to keep people working as hard as it takes to do hard things right, because the internal pressure to do the right thing disappears whenever there's a trend that allows everyone to slack off together. Everyone's keeping an eye on everyone else to see whether skimping on X is okay now; everyone has to do that, because if you're the last person still expending effort in a direction that no-one else cares about any more, then you'll have less effort than everyone else to give to other things and your work will look worse than theirs.

An external standard saves a field from that by keeping everyone honest.

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Sun May 24, 2020 10:29 am
by Symmachus
Physics Guy wrote:
Sun May 24, 2020 6:58 am
I have a linguistics professor in my family and all the linguists I know roll their eyes at all forms of grammar fascism. "Zi" could be a real English pronoun. Or not. It's up to us, and there is no other right answer beyond that. That is the accurate description of human language that linguists have found from thorough observation.

If there are otherwise some good reasons for introducing "zi", and if the idea that there is an eternally closed class of valid English pronouns is the only reason against using "zi", then sure, linguists are kicking the legs out from under the opposition to "zi". I have a hard time seeing this as linguists being prescriptive, though, or in any way dishonest.
What you describe is not prescriptive or dishonest; what I was referring to was the use of this stance by linguists in their engagement with the public in their capacity as linguists in order to lend authority to a dispute that cannot be resolved by appealing to linguistic arguments (see here, for instance, or here or here—it's not hard to find linguists pushing this qua linguists).
Physics Guy wrote:
Sun May 24, 2020 6:58 am
Suppose I call you in as a scholar of ancient languages to support my contention that kids should all have to read the King James Bible in school because its exact letter-for-letter text was dictated in English by Moses four thousand years ago and so it's the oldest book in the world. When you inform me that the King James Bible was not dictated by Moses, my position is weakened, but I don't believe I can blame you for being prescriptive.
I don't think this is an instructive example. The question of whether or not the King James Bible was produced 4,000 years ago is a factual question, not a normative one. That is not the case with the pronoun question. A linguist can establish, factually, that language changes all the time by documenting what changes have occurred and what seems to be occurring. The English pronoun "they," for example, is in a sense not English at all but was actually borrowed from Old Norse. There are technical bits of evidence within a closed system that establish this. Whether or not "they" should from here on out be used as a gender-neutral pronoun is not a factual question. No one needs a linguist's permission or authority for anything.
Physics Guy wrote:
Sun May 24, 2020 6:58 am
In the old, old days doctors and lawyers and clergy were being trained professionally, but for everyone else higher education was pretty much like training to play polo. There was a lot to learn, all right, but it didn't enable you to earn a living.

At some point it started to be argued that spending three or four years learning about old novels (say) actually was a useful professional training after all, because it was cross-training. The actual subject of study might have been a bunch of old novels but in writing term papers about them the student learned to read and think and write, and after graduating they would then be able to turn those finely honed reading and writing and thinking skills onto business administration.

As a defense of higher education in impractical subjects that has a terrible weakness. If the skills of reading and writing and thinking have so many applications in business, and if all those novels were merely placeholder dummies used in training those skills, then why not swap out the novels for business plans and let students have the best of both worlds?
You are describing the current state of things. Students hardly get any deep exposure to a humanistic education except for the minority who choose to major in the humanities. Actually, what you are really saying is that people don't need to spend any time in higher education for something like business, which I agree with. A bachelor's degree for most people is simply a badge of entrance into an interview for a job where they will use almost none of the knowledge base that they have gone into severe debt to acquire.
Physics Guy wrote:
Sun May 24, 2020 6:58 am
There has to be something like that to say why the particular subject of study is more than merely incidental to the externally valuable training that is being delivered. Most of what I've heard from humanities advocates is only that engineering and commerce grads write and think badly. That may be true but it's an argument for training professionals better, not for supporting higher education in the humanities. It's very possible that better cases for the humanities are frequently made and I just haven't heard them—my career has been almost all at science-heavy institutions.
I would say first that the overwhelming majority of my students have been engineering students. The next biggest group were physics majors and students in lab sciences like biology and chemistry or PharmD students. I enjoyed this dynamic because what I was presenting to them was so different from their usual experience, and most were usually very engaged with the courses I taught. It is true that their writing had certain features in common, namely, that they approached every problem as part of a closed system, i.e. a well-defined set of evidence to which a set of rules could rationally be applied in order to reach a sound conclusion. The fun was that history doesn't work that way. The worst students were mostly in business (not to say that the business students as a class were the worst, only that the worst were invariably business majors who thought anything that wasn't a multiple-choice test was an easy A you could get through ____).
Physics Guy wrote:
Sun May 24, 2020 6:58 am
In the long run I think that any academic subject needs to focus on some kind of practical value, not just to pay the bills, but to keep itself healthy.
On the one hand, I would replace "academic subject" with "trade school" or the like. I see nothing wrong with trade schools at all, but this distinction has led us to where we are now, because the emphasis on practical value (whatever that means) has led to cultural illiteracy among people who are supposedly the most educated. Universities used to be a place for wasting time before going on to law or politics, true, but wasting time on becoming cultured and educated by spending time around people whose overriding concern was the production and preservation of knowledge. It's still a place for wasting time for people who don't have to develop a technical skill set for their profession, but what is that time being wasted on? Education is always synonymous with culture. You're going to have a culture one way or another, and there will be institutions that set that agenda, so the question is, what kind of agenda should those institutions set—what kind of culture do you want to have?
Physics Guy wrote:
Sun May 24, 2020 6:58 am
I don't think that internal peer pressure is a strong enough force to keep people working as hard as it takes to do hard things right, because the internal pressure to do the right thing disappears whenever there's a trend that allows everyone to slack off together. Everyone's keeping an eye on everyone else to see whether skimping on X is okay now; everyone has to do that, because if you're the last person still expending effort in a direction that no-one else cares about any more, then you'll have less effort than everyone else to give to other things and your work will look worse than theirs.

An external standard saves a field from that by keeping everyone honest.
I think I agree with much of this. I certainly don't agree with people like Stanley Fish, who argue the exact opposite ("don't ask us English professors what our value is: our value is being English professors, and English professors are best positioned to determine what that means, so just keep sending us the taxpayer money, directly or indirectly, through debit cards drawing on the government coffers—I mean, uh, students, sorry"). I'm not sure that "external standard" is something self-evident, but if we think of that external standard as being set by the people paying for all of this, then I think it is much better for the humanists/liberal arts advocates to argue for the cultural value of what they do because most people want to know that there are people preserving and disseminating their cultural inheritance somewhere, even if they don't want to do that themselves. It is a kind of hobby, in a sense, but in the way that classical music is: you have to dedicate your life to it in order to preserve it. It will disappear entirely if it is left to a few Saturdays a month after the lawn is mowed. I think most people don't want that to happen, and I have been pleasantly surprised in making arguments of this sort to engineering-types. The great failure to my thinking has been with the liberal arts advocates who accept the utilitarian premise, claiming that they can teach some superior kind of knowledge that has some practical value, but then they offer something that you really can just ____ your way through. Then students get the bill, with a hefty fee for the government's picking up the tab for a few years.

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Sun May 24, 2020 11:30 am
by Holy Ghost
It seems that Mormon apologists have an easier time acknowledging the faults of Joseph Smith than they do Hugh Nibley.

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Sun May 24, 2020 12:35 pm
by Doctor CamNC4Me
Symmachus,

From your first link above this bit piqued my curiosity:

“Pronouns are by their nature political, and history shows how culture and politics sometimes drive language change. Classic examples include the use of thee among Quakers and the history of you following the Norman invasion of England. It isn’t by accident that we no longer use ye and thou.”

For a layperson such as myself, would you mind pointing me to some good sources that might expand upon the quote above, or if you're inclined would you mind hashing that bit out so I can better understand the politics of pronouns from a historical perspective, specifically "thee, you, ye, and thou"?

- Doc

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Sun May 24, 2020 7:17 pm
by Philo Sofee
Holy Ghost wrote:
Sun May 24, 2020 11:30 am
It seems that Mormon apologists have an easier time acknowledging the faults of Joseph Smith than they do Hugh Nibley.
I suspect it's because Joseph Smith is so vastly more easily seen to be totally wrong than Hugh Nibley can be seen to be. Also, whether they like it or not, Joseph Smith just didn't have the academic clout Nibley had, and in today's world that trumps spiritual thinking and prophecy, especially with the apostate Ph.D.-holding apologists like Peterson and Midgley and their followers.

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Sun May 24, 2020 9:24 pm
by Symmachus
Doctor CamNC4Me wrote:
Sun May 24, 2020 12:35 pm

“Pronouns are by their nature political, and history shows how culture and politics sometimes drive language change. Classic examples include the use of thee among Quakers and the history of you following the Norman invasion of England. It isn’t by accident that we no longer use ye and thou.”

For a layperson such as myself, would you mind pointing me to some good sources that might expand upon the quote above, or if you're inclined would you mind hashing that bit out so I can better understand the politics of pronouns from a historical perspective, specifically "thee, you, ye, and thou"?
The ignorance in the quote is as abysmal as its rhetoric is tendentious.

But to the issues raised. The basic fact is that "you" is the objective form of the pronoun "ye," which is the plural of the singular pronoun "thou" (whose objective form is "thee"). What happened in English (and not only English: see Brazilian Portuguese) is that the plural objective form supplanted all the other forms, so that we now use "you" as both singular and plural, in both objective and subjective uses. It is the only pronoun with no grammatical variation (e.g. I > me, he > him, she > her, we > us, but you > you), although English morphology has been so reduced that I'm not sure it is all that meaningful. Why did this happen? A number of reasons, but politics is not one of them, despite the claim in the quote.

The passage you quote is misleading you:

1. There is nothing that makes pronouns "by their nature" political. Anything can be used for political ends—even surgical masks have taken on political significance at the moment—but people have to make them political. Using "you" vs. "thou" did not mark your political preferences in the 16th or 17th century, though it was something like a class marker. It was a consequence of the social transformation of English society as a whole, of which the political was one but by no means the most significant element: the rise of the middle class.

2. The assumption behind this quotation seems to be one that is pervasive in the social sciences and humanities, namely, that anything that has a social motivation is actually political because everything is ultimately reduced (in their minds) to politics and power. Everything is assumed to be a function of power. His pre-existing values are coloring his read of the evidence.

3. This interest in power (faith in power?), though unstated, perhaps explains the reference to the Norman Conquest, a brute exercise of power, to be sure, but one which has nothing to do with this issue and makes me think all of this is only half-remembered from his graduate seminar in the history of English. The pronouns "ye/you" and "thou/thee" are Early Modern English (1500s to 1700s). They are a continuation of parts of the Middle English pronoun system (1100s-1500s, with various spellings), and of the Old English pronoun system before it (pre-1100s). But the Norman invasion of England was in 1066, centuries before "ye/you" and "thou/thee" existed as such, and English never adopted any pronouns from Norman French at any stage, so I have no idea why he mentions this. It was during this period that northern varieties of English did borrow some pronouns from Old Norse, and by the 14th century you can find both the "pure" English pronouns "hie/hem/heora" and "they/them/their" in elite literature produced at the Ricardian court (e.g. Chaucer uses forms of both). This is because there was no uniform standard of the language, certainly nothing imposed through the politics of the ruling class.

4. He may be referring to the fact that the plural of the Middle English pronoun (which later became our "you") was used by aristocrats to address other aristocrats, which has parallels with French. This was really quite some time after the Norman conquest, though, and it has nothing to do with politics, because it was a pan-European phenomenon that still exists in the Romance languages, German, Russian, etc. There was no political decree saying "we will now use this pronoun..." It also had a long history behind it through the Latin that all of them considered a prestige language. In Latin, this usage began in the later Roman Empire. A social motivation behind a linguistic phenomenon is not the same as a political one. There are, actually, very few examples of politically derived linguistic change, or at least of successful attempts. The ban on the old forms of address in Russian after 1917 is one example: gospodin and gospozha, "lord" and "lady," were outlawed in favor of grazhdanin and grazhdanka, "citizen" and "citizeness," though most people ended up using the Communist Party's tovarishch, "comrade." The former was a political change, but not the latter, which was a social convention.

5. What does he mean by "it isn't by accident that we no longer use ye and thou"? It's not entirely clear why the shift happened, but it is clear that nobody imposed it through some linguistic policy. Politics is the negotiated exercise of power through policy, but there was no general language policy and never has been in the Anglosphere (even the USA doesn't have an official language to this day). Almost alone among European countries, England has never had a language academy to impose standards (or invent them). Certain views of certain grammarians prevailed for reasons that had little to do with politics. One argument runs that, over the course of the 16th century, "thou" started to feel less polished than "you," perhaps because the emerging middle classes had social pretensions: by using "you" they aped the linguistic habits of the aristocracy to which many were gaining entrance, although most of the old aristocracy had been decimated by the dynastic civil wars of the late 15th century. Thus, for many of the new entrants ennobled by the Tudors and Stuarts—both dynasties were promiscuous in handing out titles—and the new gentry of the 16th century, using "you" became a linguistic symbol of their new status. But not all of these people were politicians, and it is fallacious to see every example of social distinction as political or primarily about power. Later grammarians set this up as the standard, but that was after the fact. Using "you" vs. "thou" did not mark your politics. In the literature of the period, they are used interchangeably, often within a few lines of each other in a Shakespeare play. Some people try to read a lot of significance into that, but I think it's a symptom of a shifting linguistic landscape. In the South, you will of course hear "y'all" for a group of people, but you will also hear "you" sometimes in the same conversation to refer to a group of people. This is because the pronoun "you" remains a source of instability (hence the varieties like "y'all" and "you'ns" and so on). The "thou/you" situation was probably similarly unstable. The same thing happened earlier in English (Chaucer, as I mentioned, uses both the "purer" English forms like "hem" and the Old Norse borrowings like "them," because the linguistic situation in southern England was quite fluid). Some people want all of this to reflect politics and resistance to power or some other variant of post-colonial theory, but there is simply no evidence to support this. One has to read it into the evidence, which means they discover exactly what they set out to find. I hate this kind of thinking, whether it is in the history of English pronouns or in chiasmus in the Book of Mormon.

6. The example of the Quakers he references is slightly more relevant, though not in the way he wants it to be, and it should be instructive for the tiny fraction of the infinitesimally small slice of the minority of language activists in academia who want the other half billion English speakers to accommodate their desire for linguistic validation of their identities. Insofar as the Church of England was an instrument of the English state in the 17th century, the Quakers were political dissenters, and their emphasis on egalitarianism had a linguistic manifestation in "plain speech." That is to say, they rejected the social distinction still felt as implicit in the usage of "you" rather than "thou." So they tried to enforce the usage of "thou," though without success and often incorrectly, using "thee" for everything rather than distinguishing "thou" as subject and "thee" for everything else (Mormons sometimes also make grammatical mistakes in their usage of "thee" and "thou"). That tells you right away that it wasn't part of the natural speech of most Quakers, and they couldn't keep it going. It is still a stretch to see this as primarily political, rather than as reflecting the religious values of the early Quakers (hard to separate those categories during this period, as I say), but certainly it shows that an attempt to control the language within a small community ultimately wasn't very successful because it went against the grain of the natural speech habits of its members. The same will happen with this pronoun business. That is essentially what all of the attempts to inject into the language a new usage of "they" (or any of the other suggestions on offer) amount to: imposing something that isn't recognized by most speakers. It's hard to enforce an archaism, which is what the Quakers attempted, but imposing an entirely new usage will indeed be political, because it will require significant state intervention when it doesn't naturally arise out of the speech community more broadly. Such state intervention has already happened in some places.

The fact that language changes is not an argument against anything in this debate any more than the fact that grass grows is an argument against mowing it.

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Mon May 25, 2020 6:26 pm
by Doctor CamNC4Me
Thanks, Symmachus! That was a very informative and thoughtful response to my RFI. I really appreciate it when a scholar takes a moment to use his craft to enable us to better understand a topic. The bit about Quakers reminds me of the LDS penchant for using the familiar when addressing deity, which is backward within the Spanish-speaking world. We'd also have to use the formal forms of pronouns when talking to common folk, which was very confusing and often off-putting to them. They felt like we were being standoffish or cold. Interesting, indeed.

- Doc

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Tue May 26, 2020 4:24 am
by Physics Guy
Symmachus wrote:
Sun May 24, 2020 10:29 am
[W]hat I referring to was the use of this stance by linguists in their engagement with the public in their capacity as linguists in order to lend authority to a dispute that cannot be resolved by appealing to linguistic arguments (see here, for instance, or here or here—it's not hard to find linguists pushing this qua linguists).
Hmm. The New York Times site flashed me some annoying wall and I didn't read it but neither of the other two links seemed to me to be pushing anything. One was this one linguist explaining that singular they has a long history and that usage varies in practice and over time. The other was news that a linguistics society has picked singular they as the word of the decade. I reckon they have a perfect right to do that; why not? It's an interesting linguistic trend that also gets linguistics a little time in the limelight. I don't see linguists in either of these cases doing anything more than observing facts. Where does either of them try to use linguistics to push anything?
The assumption behind this quotation seems to be one that is pervasive in the social sciences and humanities, namely, that anything that has a social motivation is actually political.
Aren't you just complaining about somebody using "political" in a broader sense than you do, to mean by definition "anything that has a social motivation"? I mean, that's a longish phrase that calls out for abbreviation, so why not "political"?

I agree that the fluidity of language says nothing about whether or not one should use "they" or other syllables as gender-neutral pronouns, any more than the fact that grass grows is a reason not to mow it. I think there are arguments made against gender-neutral pronouns, however, that are analogous to "we must mow the grass because long grass is unnatural". The fact that grass grows is a knock-down argument against that. The fact that language changes is a knock-down argument against the claim that singular "they", for example, is just wrong because it's bad English grammar. If there are other arguments against singular "they", such as that it can be ambiguous or that nobody ought to be hiding their gender behind a pronoun or something, then I agree that linguistics says nothing about that. Well, maybe about ambiguity. Some linguists do study that kind of thing.

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Tue May 26, 2020 7:25 am
by Symmachus
Physics Guy wrote:
Tue May 26, 2020 4:24 am
Hmm. The New York Times site flashed me some annoying wall and I didn't read it but neither of the other two links seemed to me to be pushing anything. One was this one linguist explaining that singular they has a long history and that usage varies in practice and over time. The other was news that a linguistics society has picked singular they as the word of the decade. I reckon they have a perfect right to do that; why not? It's an interesting linguistic trend that also gets linguistics a little time in the limelight. I don't see linguists in either of these cases doing anything more than observing facts. Where does either of them try to use linguistics to push anything?
Wow. Then you've fallen straight for it and offer an example of how the covert prescriptivism in their presentation of the observed facts works—you sincerely didn't think anything you read was taking one side in the controversy over how someone's gender identity is reflected in pronouns? You thought they were simply observing "an interesting linguistic trend"?

It is not a trend any more than an advertising campaign reflects a trend. On the contrary, it's a sales pitch and an attempt to create a trend. This is not some widespread societal trend that linguists are simply taking note of. It's a niche issue (or used to be, until it became political) that uses academic authority to impose a sense of legitimacy on people who otherwise find it laughable when they encounter it. Pieces like this are meant to use the authority of linguistics to get this proposed language shift taken seriously. They are not describing a shift that has occurred or is occurring on any significant scale but rather a shift they would like to see.

I have no problem with people's attempt to change the language (though I don't think it will be successful); my problem is with how linguistics is being used here. There is no evidence that "they" has ever been used by a community of English speakers as a pronoun to refer to individuals whose gender identity is outside the male-female dichotomy (the pronoun "it" has been used for that, pejoratively), but that is what is implied in journalism about this attempted language shift when it is not stated outright. The rhetorical switch occurs with the word "gender," which for most people refers either to a person's sex at birth or to how a person articulates their relationship to their own body (or something like that). English speakers aren't accustomed to thinking of their language as having gender, and anyone who has ever taught a Romance language (or any number of other languages) has had to explain to a lot of students that grammatical gender is not the same as personal gender or gender identity. Yet phrases like "gender-neutral" aren't being explained in this way and are being used rather imprecisely by linguists; when non-linguists, reading this in the context of a dispute about gender identity, see that "they" has been a "gender-neutral" pronoun referring to one person for centuries, they are being told that English already contains and long has contained a pronoun to refer to individuals who articulate their identities outside the male-female dichotomy. This is obviously the thrust of these pieces.

It is also not true. Grammatical gender means something else, and if we used a different word for grammatical gender the rhetorical manipulation would be obvious or impossible. That "they" is a gender-neutral pronoun is not even formally true: in referring to people, it's not gender-neutral but gender non-specific, which is a different thing. For most speakers in the history of the language, not marking gender doesn't mean the person or persons you are referring to have an identity outside the male-female dichotomy. It just means you don't know or don't feel the need to mark gender grammatically. This is not linguistically unusual. Persian, for example, doesn't have a gendered third person pronoun in the singular, but it would be absurd to think that speakers in the Islamic Republic of Iran imagine, when they say "u" (the 3rd person pronoun), that they are referring to someone of a non-binary gender identity. They are simply not specifying the gender of the referent, but it doesn't mean they presume the referent is non-binary. Using "they" in this way, as most English speakers do, unlike using "he" or "she," simply makes no comment one way or the other, but it does not foreclose the referent's being male or female. On the other hand, using "they" to refer to someone who identifies as "non-binary" does make a comment about gender identity and forecloses their being male or female. That is the observable fact, Physics Guy. What pieces like this do is take the first usage to lend historical linguistic support in arguing for the second. That is telling you an incomplete fact in order to lend support to one side of a cultural debate.

Surely, it is not lost on you that the links I posted are obviously in the context of using "they" in this second usage, as a pronoun to mark non-binary gender identity: note that beside "They" as the "Word of the Decade," the "Word of the Year" was "(My) Pronouns"—most of the half billion English speakers will have no idea what the hell either of these is referring to unless they have been associated with a small corner of academia in the past decade or consume elite media. Here we have a small group of speakers in the community of English speakers, who carry a certain level of authority in that community more broadly and who have institutional positions that add weight to their words, pronouncing as normative a particular linguistic usage that otherwise does not have much currency. That is the very definition of linguistic prescriptivism.
Physics Guy wrote:
Tue May 26, 2020 4:24 am
Aren't you just complaining about somebody using "political" in a broader sense than you do, to mean by definition "anything that has a social motivation"? I mean, that's a longish phrase that calls out for abbreviation, so why not "political"?
It's a complaint, if that's what you want to call it, motivated by my being in favor of precision in intellectual discussion that has understanding as its goal. Calling something "political" injects needless confusion and hinders understanding when that thing actually has nothing to do with the sorts of phenomena that most people associate with the word "political." It also has the danger of opening up a non-political discussion to the application of politics, which ultimately means institutional power. If some linguist nesting in the authority of an academic position says that "all pronouns are by nature political," then the prudent thing to do is push for a policy change to enforce whatever the linguistic expert says should be the case. No surprise that this has begun to happen. This pronoun issue used to be a purely academic one until universities started to reflect one side of the debate in their policies, and other institutions, HR departments, and even governments have followed suit. The fact that they get so much resistance is evidence that this is being imposed rather than reflecting common usage.
Physics Guy wrote:
Tue May 26, 2020 4:24 am
I agree that the fluidity of language says nothing about whether or not one should use "they" or other syllables as gender-neutral pronouns, any more than the fact that grass grows is a reason not to mow it. I think there are arguments made against gender-neutral pronouns, however, that are analogous to "we must mow the grass because long grass is unnatural". The fact that grass grows is a knock-down argument against that. The fact that language changes is a knock-down argument against the claim that singular "they", for example, is just wrong because it's bad English grammar. If there are other arguments against singular "they", such as that it can be ambiguous or that nobody ought to be hiding their gender behind a pronoun or something, then I agree that linguistics says nothing about that. Well, maybe about ambiguity. Some linguists do study that kind of thing.
Of course there are bad arguments made against it, but it doesn't really matter, because they reflect the underlying fact that most people just don't use pronouns in this way and never have. What is "grammatically correct" ultimately reflects what the community of speakers writ large determines as normative for their language. On this issue, you have a small group of people, not reaching anywhere near half a million in number, using their institutional clout and their control of media organs to tell the other half billion English speakers how to think about English pronouns. It's the classic definition of prescriptivism, rendered absurd in being presented as simply "observing the facts."

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Tue May 26, 2020 7:41 am
by Physics Guy
Symmachus wrote:
Sun May 24, 2020 10:29 am
I think it is much better for the humanists/liberal arts advocates to argue for the cultural value of what they do because most people want to know that there are people preserving and disseminating their cultural inheritance somewhere, even if they don't want to do that themselves. It is a kind of hobby, in a sense, but in the way that classical music is: you have to dedicate your life to it in order to preserve it. It will disappear entirely if it is left to a few Saturdays a month after the lawn is mowed. I think most people don't want that to happen, and I have been pleasantly surprised in making arguments of this sort to engineering-types.
I think it depends how big a piece of turf you want to defend. A certain minimal last bastion of academic humanities can probably survive quite securely on the grounds of preserving culture, like classical music. To my mind it certainly should survive, and I think that probably enough people agree to make the strategy viable. I work in a country where all universities are funded by tax money, and where classical music is also subsidized with tax money. So people here buy this argument, literally. Whether American colleges will be able to pull that off, I don't know.

In the halcyon days when the returning GIs all went to college and then the Baby Boom hit college age, humanities departments had large enrollments. The flag advanced. Now the fire sinks on dune and headland and all our pomp of yesterday etc. Is everyone prepared to shrink back to the minimal bastion with far fewer positions in much smaller departments? The appeal to practical value is a tactic for defending more turf than cultural preservation would be able to hold, and keeping some of those students who might be okay with some classical music if they can get a good job.

I'd be happy if the humanities could keep a decent share of campus turf. They're never going to be serious competition for funding because they don't need to buy lasers, so it's not even any skin off my nose. Everybody in my family took at least some humanities courses in college and liked them, and I'm well prepared to believe that the humanities actually can make a good case for their practical value. By studying a bunch of old novels and stuff, you may be reading about fictitious events invented for entertainment, but you are also studying some of the best writing produced on any subject in the past few centuries. By studying a bunch of old political controversies, you may be studying stuff from another world, but you are also learning from case studies in decision making and conflict management when the stakes were really high. And so on. The past isn't dead. It is not even past.
The great failure to my thinking has been with the liberal arts advocates who accept the utilitarian premise, claiming that they can teach some superior kind of knowledge that has some practical value, but then they offer something that you really can just ____ your way through. Then students get the bill, with a hefty fee for the government's picking up the tab for a few years.
Right. The case for practical value can be made but it has to be made properly. It can't just be tossed off as an attempt to bluff through with BS. And if, from the point of view of highbrow traditionalists, pandering to practical value is a deal with the devil, well, the devil will take his due. If you sell your courses as practical training then you do have to let the need for useful practical training influence what you teach and how you teach it—and what you research and how you approach it. You cannot have your cake and eat it too.

As I've said, I think that at least up to a point this is good for an academic discipline's soul, not just for its pocketbook. If people hole up in the bastion and raise the drawbridge, refusing to let anything but their own scholarly whims direct their thoughts, then I think they're bound to get intellectually fat and lazy. Tenured professors may personally be able to afford to do that, but if they do then they're really saying après nous le déluge.

Re: Midgley and Taylor Petrey Discuss Nibley

Posted: Wed May 27, 2020 3:10 am
by Physics Guy
Symmachus wrote:
Sun May 24, 2020 10:29 am
[The writing of science and engineering majors in my classes] had certain features in common, namely, that they approached every problem as part of a closed system, i.e. a well-defined set of evidence to which a set of rules could rationally be applied in order to reach a sound conclusion. The fun was that history doesn't work that way.
That's a very interesting observation, because science and engineering don't actually work that way, either, but undergrad students are indeed taught in a way that makes them think like that. Unlearning the assumption that every problem comes in a box is the main task of grad school. I'm amazed that the modular thinking of science students showed up so clearly for you, because I would have expected it to get laundered away by the immersion in another discipline, but if the target was obscured you still hit the bullseye bang on.

For four years of undergrad, students do weekly homework assignments that are always about applying the concepts from that week's lectures and that can always be answered correctly within a few hours of intelligent work. They show up in labs to perform an experiment within a three-hour time slot using equipment that is already laid out on the bench, following a procedure that is guaranteed to produce data that can be analyzed correctly over a couple of evenings. They write exams that last exactly three hours.

Nothing is open-ended in undergraduate science and engineering courses. Or at least, very little. I had one course in which we had a whole semester to design and build our own simple experiments, starting from nothing—to the point where I was buying my own parts from a hardware store—and with minimal guidance. What my experiment actually did was theoretically trivial, but the experience of making it all work myself was one of the best learning experiences I've ever had. If it had happened a year earlier I would probably have become an experimental physicist instead of a theoretician because it was a lot more fun than those stupid canned lab exercises.

Research and development aren't like typical undergrad training at all, of course. Nothing tells you what models to use or what factors are relevant and when you launch out with some assumptions there is nothing at all to stop you from stepping on a mine that blows away six months of work.

Nevertheless we keep on training undergrad students as if that weren't so. I think that first of all it's because we can. I think that science is more modular than humanities scholarship, and even though the separate little boxes are lies, they're pronounceable utterances, so we can say them straight-faced. It's harder for me to even imagine how one could meaningfully divide up a freshman history course into weekly topics. "This week we'll cover the effects of technological change in society. It will take 90 minutes." Right. Science is modular because science is the study of simple things that allow modular treatment.

Secondly, though, I think it's because learning modular approaches isn't really debilitating in science. They mostly don't work in the real world, but what does work is to keep on trying them, finding out why they don't work, and trying again. Eventually you find something that does work. And there's not much chance of avoiding all that trial and error, though the right approach often seems obvious in hindsight and you think you could have avoided all the failures if you'd only been smarter. But no-one is that smart. So the undergrad experience gives a false impression of how easy things are, but the tools it provides really are the ones that one will later apply.