Friday, October 23, 2015

The Queen: An Interview

In this episode we bring you our interview with Robin Queen. Based at the University of Michigan, Queen works at the intersections of language, gender, media, and cognition. In a nutshell, she endeavours to explicate the details of how our “mental representations of the social world” crisscross with our “mental representations of language.” Our discussion largely focused on the issues surrounding the use of media data for linguistic inquiry.
Aperitifs: As usual, I’ll point out a few gems that surfaced in the course of our conversation.

First, the practical: Recently I’ve come across a fair amount of research that makes use of media data to answer questions that, in my humble opinion, are not answerable using this source. As such, I was keen to pick Robin’s brain on effective use of the media as a body of data. (Check out her recent book on the topic, "Vox Popular: The Surprising Life of Language in the Media".) This part of the conversation contains some of the most practically (as well as conceptually) useful tidbits for the working linguist. Principally, Robin emphasizes that linguists must be honest with themselves about their priorities, as well as the scope and limits of their approach. Put simply, they must ask whether their research question can be appropriately answered using media data, recognize the limits of this type of source, and refrain from pushing beyond what is answerable. They must also ask themselves why some particular piece of media has grabbed their attention: does it faithfully address the topic they are pursuing? Both of these lines of thought depend on having first framed a scientifically tractable question that reasonably captures their interest.

Second, the conceptual: We then explored how the media, as a commercial enterprise, contrives to portray a version of reality that reflects its commercial interests. Even the most cursory consideration of this fact invites a critical assessment of the quantity and quality of variation portrayed by the media: what kinds of linguistic variation are being presented by today’s media, and what exposure do children receive to linguistic variation? On the question of children’s exposure, Robin referenced a chapter from Rosina Lippi-Green’s “English with an Accent” which explores how heroes in Disney films use standard American English whereas non-heroes use non-standard varieties. Robin points out that although there are some online discussions of this topic, they are anecdotal in nature, and that in fact very little experimental work has been done on what (if any) effect this has on our perceptions of non-standard varieties of English.

Third, the question: Finally, I leave you to mull over this idea at the intersection of language and cognition: How does the media’s portrayal of linguistic variation reflect, or affect, our interpretation of the personality traits of those characters (e.g. introversion/extroversion, mental health, optimism/pessimism)? À la prochaine!



Note: This interview / post was conducted & composed by Selena Phillips-Boyle.

Thursday, May 7, 2015

The Phoneme: An Interview With Elan Dresher

In this episode, we're speaking with Elan Dresher, Professor Emeritus at the University of Toronto.



Two things stick out from this interview like sore thumbs. The first is something I said which was at best controversial, and at worst just plain wrongheaded. The second is something which, it seems to me, was left tragically underaddressed.

(1) The Structure of the Phoneme
  
The basic idea is simple (and has its origins in Fodor's Hume Variations): the notion of “contrast” is of significant vintage in phonology, with roots in such sources as Sapir, Jakobson, and the Prague School. And yet, it’s also been a central notion in generative grammar, where a great many other structuralist notions have been eschewed.

In some ways, the idea of individuating (psychological) entities in virtue of their contrasts within a given schema is more in tune with the Pragmatist tradition, and indeed within linguistic theory it seems largely confined to phonology and lexical semantics.

Thus, the obvious question: why should we continue to act on the belief that the content of a mental particular is just the set of contrasts it can sustain between itself and every other mental particular in a given system?

As far as I know (and I know very little), it’s just not a question that bothers the phonologists in my circles -- even phonologists who are committed to explicating their corner of the language faculty in terms of a naturalist, realist psychology of language, and who would otherwise consider themselves anti-Pragmatist.

To respond to this question in good faith would mean not just arguing convincingly that, if phonology pivots around the notion of contrast, then the data of externalization can be explained with a neat formalism, but also showing that the concept has some independent theoretical motivation.

As has been noted elsewhere, a number of questions confront the concept of contrast. Two of them are:

a) Holism: If a unit in a schema is defined solely by its relationships to every other unit in the schema, then how can it be learned piecemeal? Grasping its character is a matter of grasping the relations it bears to all of the other units. (A toy sketch of contrast-by-division follows below.)

b) Substance: Concretely speaking, what is it that speakers contrast -- abstract mental symbols? acoustic features? articulatory features? 
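
Just to make the holism worry in (a) concrete, here is a toy sketch in Python of what “individuating by contrast” might look like in practice. It is loosely in the spirit of the contrastive hierarchy and the Successive Division Algorithm that Dresher has written about, but it is not his formal statement of the algorithm: the miniature inventory, the feature values, and the feature ordering are all invented for illustration.

```python
# A toy illustration of "contrast by division", loosely inspired by the
# Successive Division Algorithm associated with the contrastive hierarchy.
# The inventory, the full feature specifications, and the feature ordering
# below are invented for illustration only.

# Full (redundant) feature specifications for a miniature vowel inventory.
INVENTORY = {
    "i": {"high": True,  "back": False, "round": False},
    "u": {"high": True,  "back": True,  "round": True},
    "a": {"high": False, "back": True,  "round": False},
}

# An arbitrary ordering of features; different orderings yield
# different contrastive specifications for the same inventory.
FEATURE_ORDER = ["high", "back", "round"]


def contrastive_specs(inventory, feature_order):
    """Assign each segment only the features needed to distinguish it
    from the other members of its current set, dividing the inventory
    feature by feature until every set is a singleton."""
    specs = {seg: {} for seg in inventory}
    sets = [set(inventory)]            # start with the whole inventory
    for feat in feature_order:
        next_sets = []
        for s in sets:
            if len(s) < 2:             # already fully contrasted
                next_sets.append(s)
                continue
            plus = {seg for seg in s if inventory[seg][feat]}
            minus = s - plus
            if plus and minus:         # the feature actually divides the set
                for seg in s:
                    specs[seg][feat] = inventory[seg][feat]
                next_sets.extend([plus, minus])
            else:                      # redundant here: no contrast, no mark
                next_sets.append(s)
        sets = next_sets
    return specs


if __name__ == "__main__":
    for seg, spec in contrastive_specs(INVENTORY, FEATURE_ORDER).items():
        print(seg, spec)
    # With the order high > back > round: /i/ and /u/ are [+high] and get
    # split by [back]; /a/ is just [-high]; [round] is never contrastive here.
```

Notice how the holism worry surfaces immediately: what a segment ends up contrastively specified for depends on which other segments happen to be in the inventory, and on the order in which the features are consulted.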

(2) Phonetics-Phonology Interface

Although we briefly discuss the relationship between the two domains in this interview, we really don’t do the topic justice. For those listeners interested in an in-depth analysis, a good place to start might be Thomas Purnell's (2009) Phonetic Influence on Phonological Operations.

One issue left unconsidered in this interview is the relationship between acoustic cues and the mental symbols they typically token in speakers. As Purnell observes, the relationship between these two entities is neither direct nor predictable simply on the basis of the acoustic information. To see this, consider the observation that acoustic properties are just the physical properties of a stream of sound -- they are layered, continuous, multitudinous -- while the mental symbols they token are in important respects atomistic and categorical. Acoustic cues are quite variable from speaker to speaker, as well as within a speaker, yet listeners' interpretive behaviour is astoundingly robust.

Moreover, a single mental symbol (phonological feature) may be tokened by different acoustic cues. This suggests that the ground separating phonetics from phonology has some depth worth considering. 
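
To give the shape of the problem a little more definition, here is a minimal sketch of a many-to-one, context-dependent mapping from a continuous cue to a categorical symbol. The cue (voice onset time), the category boundary, and the rate adjustment are invented stand-ins for illustration, not an analysis that Purnell (or anyone else) actually proposes.

```python
# A deliberately crude illustration of the many-to-one, context-dependent
# mapping from a continuous acoustic cue to a categorical symbol.
# The cue (voice onset time, in ms), the category boundary, and the rate
# adjustment below are invented numbers, not empirical values.

def voicing_category(vot_ms: float, speech_rate: float = 1.0) -> str:
    """Map a continuous VOT value onto a categorical voicing feature.
    The boundary shifts with speaking rate, standing in for the fact
    that the category is not recoverable from the signal alone."""
    boundary = 25.0 / speech_rate   # hypothetical, rate-dependent boundary
    return "[+voice]" if vot_ms < boundary else "[-voice]"

# Highly variable tokens...
tokens = [5.0, 12.0, 18.0, 40.0, 55.0, 80.0]

# ...collapse onto just two categories:
print([voicing_category(v) for v in tokens])
# ['[+voice]', '[+voice]', '[+voice]', '[-voice]', '[-voice]', '[-voice]']

# The very same token can flip categories when the context changes:
print(voicing_category(30.0, speech_rate=1.0))  # [-voice]
print(voicing_category(30.0, speech_rate=0.7))  # boundary ~35.7 -> [+voice]
```

The point of the toy is simply that nothing in the signal itself tells you where the boundary sits, or that there is a boundary at all; that knowledge has to be supplied by the interpreting system.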

In essence, this just recapitulates a very old argument about the origin of structured knowledge. Empiricists hold that the origin of structured knowledge is situated in the environment, which the mind reflects; rationalists hold that structured knowledge originates in the mind itself, its final form being the product of an ongoing interaction between innate schemas and the raw stuff of the world. Doubtless the distinction as described above is whiggish, but it's enough for present purposes.

That's all for now.... 

....

Next Episode: We're speaking with the University of Michigan's Robin Queen. Stay tuned. 

Wednesday, February 11, 2015

sociolx intersects the pb

This week on the pb we dive into the fabulous realm of sociolinguistics. We sit down with Naomi Nagy, professor at the University of Toronto. Her work within the variationist paradigm seeks to understand how languages do and don’t change over time, primarily by looking at languages in contact situations.

For me, two things stand out from this interview:

(1)
Throughout our conversation, Naomi places great emphasis on the value of interdisciplinary work. She discusses how the aim of her dissertation work was not only to collect data on a small minority language, but also to contribute to larger questions of language structure and function. She suggests that more dissertations should span different domains, with supervision by professors in different areas of expertise (for example, sociolinguistics and syntax). Naomi explores the notion of “hybrid fieldwork” and the strengths that differing methodologies bring to addressing the same questions. Many within the field would agree with the idea of working towards more integration and collaboration across linguistic disciplines. However, serious discussions remain to be had about how to implement these kinds of projects in a sustainable way that would enable us to address the bigger questions of language and mind.

(2)
Every field has starting assumptions from which it works, and sociolinguistics is no different. Current sociolinguistic work is challenging several of these fundamental underlying assumptions. For instance, in this interview Naomi challenges the idea of viewing prototypical speakers as monolingual: most people speak multiple languages, and are therefore simultaneously members of different linguistic communities. As such, a single speaker could be progressive in their use of one language and conservative in another. An oft-cited example of this is women’s language: a great deal of recent variationist work shows that young women are linguistic innovators in, for example, English (see this article by Chi Luu for a popular description of this phenomenon). Conversely, sociolinguists who work on language contact varieties find that it often falls to women to maintain the heritage language and culture in the home. This leads to a tension between our conception of women as language innovators (in, for example, the monolingual, English-speaking world) and women as language conservators (in, for example, the homes of heritage language speakers). Thus follows Naomi's research question: “Are they going to innovate in one language and be conservative in the other?” This tension begins to be resolved when we take into account the bilingual nature of individual speakers.

Note: however counterintuitive it may seem to some, this work reflects another point of agreement between generative grammar and sociolinguistics: as Noam Chomsky has stressed, the idea that a speaker/hearer instantiates a single grammar is an idealization for the purposes of particular kinds of inquiry. In point of fact, a number of researchers within the generative framework maintain that real speakers extract numerous grammars from the primary linguistic data.



So if this blog is about language & mind, how does all this talk of language variation and contact intersect with cognition? Given the sociolinguistic effects that the field has catalogued over the past 50 years, it is possible, for example, to inquire whether individuals with (acquired or innate) cognitive disorders display the same sensitivity to sociolinguistic factors as neurotypical speakers, and how that sensitivity is expressed. Such an enterprise would be a natural extension of Eric Lenneberg’s foundational foray into language development under conditions of adversity. Stay tuned for more on this topic in upcoming posts, including our interview with Robin Queen, wherein we delve into her work on language contact and variation and explore how cognition plays a role at this intersection.

Thursday, January 22, 2015

[yourdiscipline] is really just [mydiscipline]

If you’re involved in any discipline concerned with the nature of language and mind, this line probably sounds familiar. If you’ve been in the scene for essentially any length of time, a thick, bony cartilage has probably developed around any part of your psyche that may have ever taken such pronouncements (which are re-issued virtually every year in some form or another) at face value.

Internal to linguistics, something like this sort of logic usually concerns the passing of particular phenomena between (e.g.) syntax, phonology, semantics, and pragmatics. (I recall a number of students at the 2013 Linguistic Institute who were thoroughly scandalized by Sam Epstein's observation that word order is evidence about articulation, not syntax. Alternatively, I’ve got at least a couple of phonologists in my circles who are always trying to explain to me how this, that, and the other syntactic process is really just derivative of prosodic considerations). This kind of topic-shuffling can be highly productive, and much of the time it is an indicator that the field still has a pulse. However, too often it is a reflection of flash-in-the-pan trends and academic politics.

External to linguistics, something like this sort of logic concerns the division of labour between neuroscientific and psychological inquiry.

Today, we’re posting our interview with one computational neuroscientist who seems to make both sides of the aisle sit a little easier, all the while maintaining a substantive proposal for integrating linguistics and neuroscience. Most refreshingly, he's challenging the [yourdiscipline] is really just [mydiscipline] rhetoric that's been so pervasive in the neuroscience of language. Below is a chat we recorded with David Poeppel back in December 2014.





A couple of things to note:
  • You can find more Poeppel & Co. over at the spectacular blog, talkingbrains
  • Poeppel argues that the right level of abstraction for the basic unit of computation is the neural circuit (see, for instance, his Towards a Computational(ist) Neurobiology of Language); this would seem to be, at least prima facie, in contradiction to Gallistel's recent sermons, in which he argues that the basic unit of computation is intraneuronal. Perhaps there's no contradiction for these two researchers and these differently sized units of computation are complementary -- however, I didn't catch this difference in time for the interview. Perhaps you have some thoughts on this? 



Notes, Admissions, Qualifications, and Apologies: 
  1. the title of this post is lifted from Laura Howes's tweet under the briefly (but thoroughly) trendy hashtag #ruinadatewithanacademicinfivewords
  2. I have no idea whether I am pronouncing "Poeppel" correctly, having neglected to confirm that during the interview. If I've screwed it up entirely, all apologies.
  3. I mispronounce the word "incommensurable" for the first third of the interview. I can live with it.

Monday, December 15, 2014

It's happened.

I've become that person to whom people send the twitter accounts of dedicated empiricists, and connoisseurs of a certain Dalhousie U philosopher's potpourri.

One could become quite distraught about becoming the neighbourhood exorcist... after all, the job is confrontational and messy, and life is short. On top of all of that, linguistics already has a full egg carton of exorcists.

In any event, I've decided to take it in stride. If linguistics doesn't have any more space for self-styled exorcists, perhaps it could use a few more rabbis?

///

More to the point of this post, I've decided to take a much needed holiday for the remainder of the month.

Beginning in early January, the PB team will return to posting interviews with the new & the hip, starting with none other than David Poeppel.

In the meantime, check out this fucking blog: http://stronglang.wordpress.com/

Excerpt from stronglang:

"If vulgar language offends you, then thank you. You’re one of the people who help maintain the effectiveness of vulgarities. You will very likely be offended by the articles in this blog.

If vulgar language interests or entertains you, or is a constant source of solace or release, and if you like language in general, then come on in.

This blog gives a place for professional language geeks to talk about things they can’t talk about in more polite contexts. It’s a sweary blog about swearing."
.
.
.
Until January, Comrades.


Saturday, November 15, 2014

Touring the Language Faculty: An Interview with Norbert Hornstein

did october happen? it seemed to careen right past me into mid-november. despite this unforgivable betrayal by one of my favourite months of the year, I did manage to pull off a mighty fun interview with syntactician, philosopher, and fellow-blogger, Norbert Hornstein.

I originally met this chap when I attended his syntax course at the LSA summer school some years back. listening to him speak on the topic, one is apt to get the feeling that generative grammar is building a cool mad max death truck out of scrap metal and wishes (to borrow a phrase from my flatmate). this is something that is often missing from the average lecture on generative syntax, wherein one couldn't be faulted for getting the impression that the field is trying to do philology with both hands tied behind its back (methodologically and theoretically). Norbert is no philologist though, neither on his blog, the Faculty of Language, nor in this interview.

as usual, I'll take a quick dip into something raised in the interview that caught my attention.

during the latter half of the interview (at about 48m) Norbert mentions the distinction between linguistics and philology. elsewhere in his writings, the distinction is made by appeal to such notions as explanation and description. I think that Norbert is right to point out that often enough the concerns motivating a programme of research aimed at a faithful description of a language are orthogonal to those motivating a programme of research aimed at discovering the organizing principles which underlie language tout court. nevertheless, there can also be a palpable tension between the two. consider, for instance, the levels of theoretical adequacy demarcated in Radford (1982):

"a grammar of a language is observationally adequate if it correctly predicts which sentences are (and are not) syntactically, semantically and phonologically well-formed in the language.

a grammar of a language is descriptively adequate if it correctly predicts which sentences are (and are not) syntactically, semantically and phonologically well-formed in the language, and also correctly describes the syntactic, semantic and phonological structure of the sentences in the language in such a way as to provide a principled account of the native speaker’s intuitions about this structure.

a grammar attains explanatory adequacy just in case it correctly predicts which sentences are and are not well-formed in the language, correctly describes their structure, and also does so in terms of a highly restricted set of optimally simple, universal, maximally general principles of mental computation, and are ‘learnable’ by the child in a limited period of time, and given access to limited data."

notice that there is a conflict between descriptive adequacy and explanatory adequacy. the former is in a permanently taxonomic mood, primarily driven to record, sort, and occasionally predict particular language forms (and meanings); whereas the latter is in a mood to gloss, to provide the rules in virtue of which languages contain the forms and meanings that they do, and the specific pairings between form and meaning that they do.

the conflict arises when we try to map a chaotic, constantly changing world, in which accidental and principled variation are observationally indistinguishable, onto the world of intelligible theory, in which consistency and evaluability are supreme values. to my knowledge, the conflict between the two was first noticed by the ancients. for them, a crucial problem was how to relate the sophisticated geometrical and mathematical models of the time to the chaos of worldly phenomena such as motion. Galileo was really the first (again, to my knowledge) to show the possibility of applying the concepts of geometry to the highly variable phenomenon of motion. relatedly, Bacon was the first to be recognized for proposing a mode of inquiry for, inter alia, discerning accidental variation from principled variation: namely, to carry out experiments which contrive experiences. the virtue of carrying out laboratory experiments is that it is possible to discover crucial discrepancies in theoretical prediction which can be used to home in on the essential nature of a thing.

(returning to language & Radford's levels of adequacy)
feel free to substitute whatever variable concerns you besides the syntactic, and whatever metric by which you'd like to evaluate well-formedness. but notice that the problem of marrying your descriptive analyses with your characterization of the abstract grammar doesn't go away (whether it be a grammar of gesture, or social relations, or morals). this is because psychological (to say nothing of theoretical) objects, grammars among them, are necessarily normative while the data is decidedly not. that is, grammars characterize a set of things which a given speaker (or speech community, if you really insist) will find well-formed with respect to form and meaning. so even a sociolect (a dialect in which linguistic varieties are correlated with sociological factors) is a kind of grammar in virtue of which speakers sort sociolinguistic forms and meanings into the well-formed and the ill-formed.

ultimately then I suppose it wouldn't be too off the mark to encapsulate the tension between linguistics and philology as a tension between accounting for forms (and meanings), which are quite varied and diverse, and accounting for the sense of well-formedness, which is largely stable and shared commonly amongst all humans.

caveat: the centrifugal force between philology and linguistics is, as any sensible researcher would acknowledge, quite often counter-balanced by a centripetal force between the two disciplines. specifically, philological projects set a baseline which any linguistic theory must meet if it is to be observationally and descriptively adequate. symmetrically, theoretical work provides the intellectual scaffolding by which philological work can proceed (think metrics of simplicity; criteria of sorting words into classes, languages into families, and the like; the very decisions about what is important to put in your taxonomy and what is not).


The End.


Notes, Admissions, Qualifications, and Apologies:
  1. Radford, Andrew. 1982. Transformational Syntax. Cambridge: Cambridge University Press. 
  2. Apologies for the odd clicking that starts at about 35 minutes. we are working on making sure that stops happening. if you have any insight as to where this clicking is coming from or how to get rid of it we would be very grateful.


Saturday, November 1, 2014

Science—Like The Shape of Bras—Changes Over Time

A few weeks ago, I had the good fortune to catch up with my good friend, and historian of science, Benjamin D. Mitchell. We oft carried on lengthy arguments about the politics of science-doing while he was working on his PhD at York University, and on this latest occasion I couldn’t resist making a brief transcription for the PB.

For some context: B.D. Mitchell is one of Canada’s foremost contemporary experts on Nietzsche and the psychological controversies of the late nineteenth century; a scholar of the periodical press & the popularization of science in the pre-WWII era; and a lecturer at the University of King’s College (Halifax). He is also Editor-in-Chief of Beyond Borderlands, a critical journal of the weird, paranormal, and occult.

I think contemporary scholars of mind should be concerned with the history of the sciences not only because it offers us case studies about how and why progress & regression occur during the process of inquiry, but also because the political economy of science, which cannot be understood without a historical knowledge, is a monumental influence on our lives as brain-workers: from federal science policy, to the structure of our professional societies, down to the office politics that shape our teaching (and learning).

*Caveat for the Q/A: It is my perception that B.D. Mitchell’s perspective on science reflects the sensibilities of a historian, whereas my own sensibilities (and maybe yours) are those of a practitioner. In other words, Mitchell is often wont to bracket the truth/falsity of a particular belief system as part and parcel of his mode of historiography. This is all to the good within that domain. But the working linguist needs their "cheques" to cash at the end of the day, and if history can help make that happen, then good. If it’s not false, great; if it’s true, even better. (To put it in a less flowery way: the working scientist must un-bracket the truth/falsity of the belief systems that are available to them if they are to make progress in the sciences). This is often the cause of great tension between scientists and historians/philosophers of science, as you will likely experience in reading the Q/A below.

Embrace the tension; it will enrich you. Happy reading:

~mb~

most scientists-in-training aren't obligated to study the philosophy or history of the sciences. this has led to a number of issues in science-doing that we've chatted about before. if you could give one piece of advice from the history or philosophy of science to the contemporary working scientist, what would it be?


~bdm~

Keep your doors open, physically and metaphorically. Recognize that there is a social element to your science. The best scientists have historically been those who were the best at listening in to the larger discussions going on around them, and seeing how their own specialties could be productively applied within these larger discussions. The “reclusive scientific genius”, from Galileo, to Newton, to Darwin, Tesla, and Einstein, is more of a rhetorical device that devotees use to surround their intellectual heroes with an air of worship than an actual condition of their thought and work. They were not alone, just as you are not alone. They do great things because they are greatly interested in the world, both in its most mundane sense, and in its most exalted. That is all.

~mb~


how has scientific discourse changed over time?

~bdm~

I think that one of the biggest changes between the scientific discourse of the 19th and 20th centuries has been in terms of how the changing bureaucratic structure of financial rewards that scientists received for their work influenced the teaching, style, and intended audience of scientific writing.

The less prestige science had, and the more informal the teaching of science was, the more those proposing controversial scientific theories had to write well and for a mixed audience, appealing to both the specialists in their fields, and potentially high profile public backers and policy makers.

Thomas Henry Huxley wanted scientists to be both financially rewarded specialists and the new cultural elites capable of shaping public opinion. Yet arguably, the development of funding bodies and formalized teaching institutions throughout the nineteenth century led scientists to gain greater internal prestige and monetary incentives at the cost of sequestering themselves away from the very public that Huxley saw as the basis of securing the financial freedom and cultural importance of the scientist.

His victory was a partial one that would have profound implications for the relationship between science and the media. The varied interests of popular journals, newspapers, radio, and television have remained more or less steady; what changed was how the scientists themselves interacted with these forms of media.

While there were many important scientific popularisers in the 19th century, there were also plenty of practicing scientists whose professional writings were also targeted at a popular audience. It’s not that the popular media itself has changed; what changed were the reasons for scientists to actively participate in broader discussions about science, and the range of venues in which such discussions happened.

~mb~

how has the scholarly/popular perspective about the relationship between language & thought changed over time?


~bdm~

I think that in the history of the study of language we see several interpenetrating traditions that circle around some fairly fundamental questions: do we create language or does language create us? Are the limits of language the limits of thought? How does language relate to the world? Where does language come from? What is common about language? Where are the differences?

I say that the various philosophical, religious, cultural, etc. traditions that have thought about language are interpenetrating because no society has ever just had one answer to these questions. They’re not dichotomies so much as they are continuums. Because of this there is no one arrow of change, but a web of interrelated changes.

Despite this, starting around the time of the modern research university, disciplinary trends seem to be increasingly set on turning these questions into dichotomies for the purposes of teaching them in a formalized manner that could be used to process an ever-growing number of students. In this regard many of the problems facing the study of language are the problems of modern professionalisation more broadly. That makes it difficult to talk about how the discussions differ between scholarly and popular perspectives, for these discussions are part of what makes this dichotomy in the first place. Here I’ll refer readers to Tuska Benes' In Babel's Shadow: Language, Philology, and the Nation in Nineteenth-Century Germany and William Clark's Academic Charisma and the Origins of the Research University.

One of the most important consequences of this is that questions of the relationship between language and thought are caught up in the problems that plague debates about the relationship between the subjective and the objective more broadly in science and society. This is one particular point at which the study of language stands to gain the most from observing trends in the history and philosophy of science, which has been trying to wrestle with these issues for a very long time. See, for instance, Lorraine Daston's and Peter Galison's work Objectivity.

~mb~

what ought to be the division of labour between metaphysics and epistemology in the study of mind?

~bdm~

I think that epistemology is what allows us to understand our limits, while metaphysics is how we act creatively within those limits. Anything deserving the name of knowledge requires both. The error, and the conflation of the two, comes from thinking that we can use epistemology to come to any one certain and specific answer about the structure of the world, or, in this instance, of the mind. What epistemology can do is bring us consistently to a place where we can realize the necessity of having a metaphysics, but not the content of those metaphysics.

We can lament and gnash our teeth at the uncertainties of our finite existence, or see ourselves as skilled and living artists capable of producing whole ecologies of knowledge and meaning. This needn’t lead us to the bogeyman of an “anything goes” style of relativism, but to a more refined relativism that can show us how there are still many important and shared structures and forms of evaluating the world that are common to the human, even if we can never prove that they are transcendental absolutes. And this is a good thing, for the absolute is inimical to life; it’s incapable of motion or growth. Epistemically, an ecology of absolutes is a monocultural wasteland, and no ecology at all.

~mb~

often in our conversations you invoke the voice of Nietzsche. what do you think it entails about the nature of our minds (language & thought) that it is possible to reliably adopt the style of reasoning and language of another person?


~bdm~

*laughs* I can’t claim that it’s ever reliable to adopt the style of reasoning and language of another person. Indeed, I would warn against thinking that, or of adopting any one other person’s ideas too completely, but it is productive to study some things, or people, deeply. You have to be aware though that what you study changes you. It can be incredibly enriching, but also limiting in its way. I think of it as a process much akin to aging, or at least aging well. Again, you’re finite, so you have to make choices, and those choices leave their mark, because you have a history.

I guess what I am trying to say is: be careful what you research!

.

.

.

The End.