Tags
Academia, Antonio Damasio, cognitive science, Daniel Dennett, Experimentation, Failure, Humanities, Literary Scholarship
Just a few minutes into my dissertation defense, a little over a year ago now, my outside reader pointed to the numerous places in my dissertation that called for interdisciplinary efforts to build a “better bridge” between the humanities and the sciences, a phrase I myself borrowed from scholars and scientists such as Thomas Metzinger, Antonio Damasio, and Keith Oatley. Openly acknowledging that he wanted to play devil’s advocate for a moment, he asked why I felt that obnoxiously hubristic scientists should be seen as having anything to contribute to work in the humanities in general, or in literary studies in particular. (I’ll note that this particular committee member is someone who does quite a bit of interdisciplinary work himself, so I immediately understood that this was a genuinely open question). Following that opening question, the conversation went several different ways, as such conversations are wont to do (and should) in these instances; again, the person asking was presenting the question as an exercise, not putting it forward as a belief he held.
My response was that, beyond providing access to research and ideas outside the scope of humanities scholarship, I appreciate how those working in the sciences are usually prepared to admit that, after further consideration or in light of new advances in research and technology, some of their previous assertions and ideas may no longer be viable. I very briefly explained that this is an attitude humanities scholars would do well to adopt: pursue hypotheses, present and even publish arguments based on them, and then allow the community to help test them. The traditional model, by contrast — develop an intricate and exacting idea, then dedicate subsequent decades to expanding and defending that idea (often) blindly and at all costs — leads to overspecialization, isolation of knowledge, and a system in which so-called “original” work is all but impossible to produce, yet is still the standard used for hiring, promotion, and even job retention. (For more on this, look up Jonathan Gottschall’s excellent 2008 book, Literature, Science, and a New Humanities).
The example I pointed to at that moment was Antonio Damasio’s acknowledgement, in the first pages of his magnificent 2010 book Self Comes to Mind, that he had recently “grown dissatisfied with my account of the problem [how the brain produces consciousness],” despite 30 years of work on the issue. He goes on to note that there are two particular issues he wishes to revisit — the origin of feelings and the mechanisms that work to construct the self — before noting that his book is simply an “attempt to discuss the current views,” while also identifying “what we still do not know but wish we did” (6). (For an in-depth review / overview of Self Comes to Mind, stop by Ginger Campbell’s always-insightful and fascinating Brain Science Podcast, which I’ve blogged about before, and check out her recent discussion of the book).
This was all brought back to mind this morning, as my Twitter feed alerted me (via 3 Quarks Daily) to Daniel Dennett’s recent article over on Edge.org. From the very first sentence, Dennett makes it clear that he wishes to revisit a fundamental component of some of his earlier work and discuss the ways in which recent research more or less invalidates some of his original assumptions: “I’m trying to undo a mistake I made some years ago, and rethink the idea that the way to understand the mind is to take it apart into simpler minds and then take those apart into still simpler minds until you get down to minds that can be replaced by a machine” [homuncular functionalism]. In short, his original idea, following the McCulloch-Pitts “logical neuron” concept, was that neurons could be viewed as “switches,” turning off / on as required during computational processes. While this was a grand oversimplification even at the time, he acknowledges, recent research has revealed just how much of one: “each neuron, far from being a simple logical switch, is a little agent with an agenda, and they are much more autonomous and much more interesting than any switch.” Dennett goes on to note that this suggests the brain is not at all a sort of “well-organized hierarchical control system where everything is in order,” but rather “more like anarchy with some elements of democracy.” He further acknowledges that this enhanced understanding is made possible by two aspects of the current climate in neuroscience research, namely massive quantities of useful data and “bright young people who have grown up with this stuff and for whom it’s just second nature to think in these quite abstract computational terms,” terms and ideas which their predecessors had to struggle to conceive of and visualize 30 years ago.
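(For readers curious about what a McCulloch-Pitts “logical neuron” actually looks like, here is a minimal sketch in Python. This is my own illustration, not anything from Dennett’s article, and the weights, thresholds, and function names are illustrative rather than canonical: the model treats each neuron as a pure threshold switch that sums weighted binary inputs and fires only when a threshold is reached. Chain enough such switches together and you can compute any boolean function, which is exactly the tidy, machine-like picture of the brain that Dennett now says the evidence has outgrown.)

```python
# A minimal sketch of a McCulloch-Pitts "logical neuron" (1943).
# Illustrative only: the neuron is modeled as a pure switch that
# sums weighted binary inputs and fires (outputs 1) only when the
# weighted sum reaches a threshold.

def mp_neuron(inputs, weights, threshold):
    """Return 1 if the weighted sum of binary inputs reaches the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With suitable weights and thresholds, these switches compute logic gates:
AND = lambda a, b: mp_neuron([a, b], [1, 1], threshold=2)
OR  = lambda a, b: mp_neuron([a, b], [1, 1], threshold=1)
NOT = lambda a:    mp_neuron([a],    [-1],   threshold=0)

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(f"AND({a},{b})={AND(a, b)}  OR({a},{b})={OR(a, b)}  NOT({a})={NOT(a)}")
```

The appeal of the picture is obvious: stateless, obedient switches compose into circuits, and circuits compose into minds. Dennett’s point is that real neurons, as “little agents with agendas,” are nothing like this stateless switch.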
Dennett’s enthusiasm for the “happy” fate of his original concepts here, a fate shared by the ideas and concepts of others that made his own possible in the first place, is boundless. He’s happy to abandon the old, rigid models, which he feels were still “worth trying for all [they were] worth.” After all, you “go for the low-hanging fruit first,” which in this instance was to assume that brains operated as very simple machines. When that model proves not to work, the trick is to understand why not, which, he notes, has started to become possible.
I won’t go into the details of Dennett’s revised ideas on the subject, other than to highlight his assertion (joining many, many other cognitive scientists) that increased understanding of brain structure continues to suggest strong links between brain evolution and cultural evolution; his next major project will be to “take another hard look at cultural evolution… [to facilitate] a proper scientific perspective on cultural change.” This caught my eye in particular because it brought me full circle back to my dissertation defense and certain lines of questioning. Anyone attempting to operate in the field of cognitive literary studies will likely tell you that they have encountered similar questions; some have had their work dismissed, ridiculed, or ignored precisely on the grounds that any literary scholar turning to the sciences for information, methodology, or both must be likewise “arrogant” and “hubristic.” And to be fair, one need not look far to find scientists for whom those labels fit quite nicely (Richard Dawkins, for all the vitality of his early work, springs to mind here for some reason). Yet the humanities are equally rife with folks whose work fits that same description, and in my estimation, these often wind up being the very people young scholars are expected to look to as role models. The furor and excitement over Stephen Greenblatt’s wildly popular and even more wildly overreaching 2011 book The Swerve is a good example. Even as the book went on to win a Pulitzer, the National Book Award and, more recently, the MLA’s James Russell Lowell Prize for excellence in literary study, many scholars were left scratching their heads (Jeffrey Cohen provides an excellent analysis of the disparity between the book’s critical reception and the critiques leveled against it by a variety of scholars and readers here).
Here’s what I would ask: that young scholars work to model the sort of behavior Dennett and Damasio display in the instances I’ve highlighted here. In other words, conduct research and share results in the spirit of experimentation and collaboration, partnering with researchers and scholars in other fields — be it biology, anthropology, chemistry, computational science, neuroscience, music, dance, and so on — to re-conceptualize and re-engineer the fields they work in. Even more important, of course, humanities departments need to open themselves up to this new model as well, working simultaneously to create and sustain an atmosphere of experimentation and collaboration while also convincing university administrations that the time is long past for abandoning the academic monograph as the sole standard of the importance and impact of one’s research. Last year, the department where I held my postdoc hosted a talk by a seasoned editor from a storied and respected academic publisher. Throughout the talk, he advocated for precisely the same thing, arguing that the current emphasis on pushing doctoral candidates in the humanities to produce dissertations ready for rapid transition to monographs, particularly ones aimed at the elite (read: conservative) publishers, is killing the vitality of the humanities. Instead, he urged young scholars and dissertation directors alike to encourage creativity and experimentation aimed at creating and sustaining broader conversations in which the highly specialized monograph is but one of many possible outcomes. For me — at work on broader research that is developing a significant portion of my dissertation project into a manuscript — this blog is one of the ways I can take part in that conversation.
But I would also ask that we all be willing to let ourselves fail from time to time — to publish a blog post or even a paper outlining a project that arrived at unexpected or even seemingly unhelpful conclusions — and that we be allowed to fail. As Dennett points out, without decades of work attempting to prove the McCulloch-Pitts hypothesis and other similar ideas, the currently evolving concepts of the complexity and plasticity of the human brain would not be possible. Every once in a while, it’s actually useful to consider what failure means. We’re awfully quick to assume or suggest that failure equates to a waste of resources, or even signifies some sort of inherent personal weakness in those undertaking the task to begin with. “Good” scholars never fail, we are told; “bad” scholars fail and do not receive jobs, promotions, tenure, publication, and so forth. If a particularly strong idea invites a decade of challenges from hundreds or thousands of academics, I’d like to think the results will be far more valuable and stimulating than those of operating on the assumption that the same idea is infallible and therefore should not be challenged. For me, Greenblatt’s recent book is utterly baffling in its assertions and seemingly slipshod approach to historical research; but if that makes it a “failure” in the eyes of some scholars, scholars who find the book’s claims a compelling reason to investigate not only the critical apparatus that produced it but also the assertions and methodology it displays, then it just might be one of the more important books of the year after all. Let’s start thinking about having the audacity to fail — to explore, to experiment, to challenge, and to experience the rush of excitement that comes from realizing that a failed hypothesis has actually opened the door to new possibilities.
Yeah! Thanks, Dr. Josh. Your PhD dissertation is a real success! I just wish you could tell us how much time it took you to formulate its hypothesis, and how you cultivated your love for the cognitive turn. Do you think that cog sci can be successfully applied to contemporary fiction? I’ve observed that the classics, with their great language and ideas, and especially drama, have offered better fodder up till now.
Hi Amy, thanks for the nice compliments and the interest; good to know that my dissertation has found some interested readers! Briefly: I arrived at the core of my thesis after about 6-8 months of research and outlining. I was looking for a methodology that would help me resist the critical trend, in early modern literary studies, of reading character identity and interiority as (almost) purely constructivist in nature. In other words, I wanted a way to connect my own interest in character consciousness with what I saw as a similar period interest in understanding mental states and interior emotions. Owen Flanagan’s 2002 book “The Problem of the Soul: Two Visions of Mind and How to Reconcile Them” was a helpful first guidepost for me, quickly pointing me to works by Dennett, Damasio, Metzinger, and others, on the way to literary scholarship by Ellen Spolsky, Mary Thomas Crane, Amy Cook, and many others. From there, it took me just over a year to research, write, and edit the final document.
I appreciate your point about the classics / period literature offering particularly useful fodder for cog sci readings, and I think that in many ways you’re right — some of the elaborate formal structures of older poems and plays are particularly interesting because they were written before the word “consciousness” took on the valences it holds for us today (modern usage begins around 1632); thus they use metaphor, simile, and mythology to express their own sense and understanding of interior mental states. However, there is some interesting work going on in (more) modern fiction as well; Lisa Zunshine, for one, has produced some very interesting work on meta-representation in detective novels.
Hi Dr Josh,
This is not just a compliment: I’m currently reading your work word by word (I’m on page 50) and deriving enjoyment and real benefit from it! Thanks a lot for this brief tracking of ideas and for the recommendation as well. I just have a detailed question, if you’ll permit it. Your PhD was done in about 2 years, and there is a gap of 5 years between it and the M.A. If the reason is working (as an instructor, for example), do you recommend not doing that if a person has the option? I really need this piece of advice from an experienced person like you. Thanks in advance.
Hi Amy,
Glad to be of help. I’m not entirely sure I recognize the degree timeline you have here; I went right from my M.A. (2 years) into my doctoral program (4.5 years), finishing in 6.5 years total, which is actually fairly quick by the standards of most American doctoral programs. In most cases, if you do not have an M.A. to begin with, you’ll have about 3 years of coursework before you start 1 year of exams and dissertation prep, then about 2-3 years of dissertation writing. It’s not uncommon for some of those processes to get disrupted and lead to delays; I think the national average is about 8.5 years total, or was the last time I looked it up. I did teach throughout my time as a grad student, anywhere from 25-50 students per term, and it’s true that teaching slows down the writing process – you spend time preparing classes and grading papers that you might otherwise use for writing and research. However, a little teaching is valuable to have on your CV if you plan to apply for teaching jobs, and in most cases teaching means that the university will waive part of your tuition as well as pay you a small salary (and maybe healthcare, if you’re lucky). If you can afford to get the degree without teaching – that is, if you can pay tuition and living expenses on your own, which combined will likely average $30-$40K / year at most US colleges – then that’s the best way to go; but if you have to take out $20-$25K / year in loans for 6-7 years (roughly $120K-$175K of debt in total) to avoid teaching, then you’re better off teaching. Either way, you should plan for a path that will take 6-7 years from start to finish, including the M.A.
Hi Dr Josh,
Thanks a lot for your help and for taking the trouble to write out this valuable advice. I just asked because you said above that you spent about 8 months researching plus a year writing. So why a 4-year program? Or do you mean that this roughly two-year period was distributed over the 4 years? Please accept my sincere thanks even if you don’t have time to answer. Best wishes, Dr.
Hi Amy,
I think I see your point a little more clearly now. When I ended my coursework and completed my exams, I had been working in a different area of study in early modern drama — researching the way early modern writers worried that the legal system was a poor vehicle for justice — but I wanted to move in another direction. (I should note that these classes are in addition to those taken for a B.A. or B.S. degree — graduate degrees require specialized classes). So, after finishing all of my classes and completing my exams, I started down a different path with my dissertation, which took a few extra months to get reconfigured. Once I had done the initial research and outlined the chapters, it took me a little under two years to do focused research, write and edit the dissertation, and defend it. Hope this helps clear it up for you!
Hi Dr,
Yes, it did clarify things. Thanks a lot. Helpful people like you are making the world a better place. My best wishes.