By now, anyone who follows internet journalism has heard of science journalist and general intellectual wunderkind Jonah Lehrer’s resignation from his new post at The New Yorker last Monday (7/30) following charges of plagiarism of his own work, as well as fabricating quotations from Bob Dylan in Imagine: How Creativity Works, his latest book. I won’t rehash all of the details here — if you’re unfamiliar with the story, Josh Levin’s article over at Slate does an excellent job of parsing out the details while also offering thoughtful analysis of the affair. You can also check out two articles over at the Daily Beast: Howard Kurtz interviews Michael Moynihan on how he discovered Lehrer’s fictitious Dylan quotes, while Jayson Blair (yes, that Jayson Blair) compares his own experience as a plagiarist with Lehrer’s situation. And, perhaps unsurprisingly, the depth of the problems continues to increase, as Joe Coscarelli highlights over at NY Mag.

What I want to discuss here is a notion that Levin also highlights, namely the way Lehrer seems to have buckled under the pressure of constantly producing original and thought-provoking insights on the relationship between everyday culture and scientific advances via his blog, Frontal Cortex. Levin puts it brilliantly: “A blog is merciless, requiring constant bursts of insight. In populating his New Yorker blog with large swaths of his old work, Lehrer didn’t just break a rule of journalism.” I’m a little less enthralled with Levin’s closing comments: “By repurposing an old post on why we don’t believe in science, he also unscrewed the cap on his brain, revealing that it’s currently running on the fumes emitted by back issues of Wired. For Lehrer and The New Yorker, the best prescription is to shut down Frontal Cortex and give him some time to come up with some fresh ideas. The man’s brain clearly needs a break.” I also see the wisdom in Stuart Kelly’s commentary in the Guardian on Lehrer’s responsibility, as a science writer, to provide well-documented facts: “If we cannot trust him to transcribe a quotation, why should we trust any of his scientific speculations about what the nucleus accumbens and the ventral striatum are up to?” I would add a caveat of sorts here, though: as a science journalist, Lehrer’s job is to inform the public of interesting advances in modern science and to explain their meaning for a lay audience — not to provide research-journal-worthy analysis of scientific experiments. If anyone reading his various blog posts or even his books takes him as a supreme authority on neuroscience, they’re missing the point of his work. Still, the points are well-taken; Lehrer screwed up, and he’s going to pay for it. And I can understand Levin’s frustration with a fellow journalist for cutting corners and engaging in unethical practices in order to meet deadlines and inflate his public image.
Writing — at least, writing well — is a difficult process, as many have already noted.

However, although I do not endorse Lehrer’s methods in any way, I find it difficult to cast my lot in with the “tar-and-feather-him” crowd entirely. Journalism is a demanding field, and every year hundreds of thousands of journalists plug away at their keyboards, the vast majority of them striving to present accurate information in a concise and ethical manner. Whether it’s your job to cover the minutes of the Nowheresville, Middle USA city council meeting or to provide in-depth analysis of what effect Bernanke’s latest speech is having on gold futures, you have to package a lot of information into a relatively compact set of words, which should be arranged with perfect technical and mechanical precision, all while working under tight deadlines. Cut corners, miss a deadline, or produce sloppy work, and your editor (if she or he is paying attention, at least) will give you a wake-up call. Keep up the poor work, and you’ll get canned. Even so, I suggest that Lehrer was under a different kind of pressure than most — the pressure to prove his worth as both an intellectual dynamo and a creative powerhouse.

In his essay “Late Bloomers: Why Do We Equate Genius With Precocity?”, Malcolm Gladwell writes that “Genius, in the popular conception, is inextricably tied up with precocity – doing something truly creative, we’re inclined to think, requires the freshness and exuberance and energy of youth,” citing Orson Welles, Herman Melville, T.S. Eliot, and Mozart as examples. But this equation is hardly universal, as he notes: while Picasso’s early works are generally seen as his best, Cézanne is another matter altogether; many of his finest works were painted when he was in his mid-sixties. Throughout the essay, Gladwell references research conducted by University of Chicago economist David Galenson, who suggests that creativity can be divided into two categories, conceptual and experimental. Conceptually creative people peak early – they envision great things and mentally reverse-engineer them in order to actually realize these visions. Experimentally creative people do just the opposite; they spend years – decades, even – struggling with various approaches, letting the vision reveal itself through trial and error.

Gladwell highlights two contemporary writers as examples of these categories of genius in action among the literary set. On the conceptual side is noted novelist Jonathan Safran Foer, who encountered creative writing for the first time as a freshman at Princeton and wrote the first, three-hundred-page draft of his celebrated novel Everything is Illuminated following his sophomore year. On the experimental side, Gladwell points to lawyer and writer Ben Fountain, who followed his dreams and quit a high-paying position in the legal profession to spend nearly two decades writing his first book, Brief Encounters with Che Guevara, which was published when he was in his late 40s, received massive critical acclaim, and launched his career as a journalist and writer. Gladwell notes that economics plays a pivotal role in the way we define “genius,” even as we ignore the intense personal and sometimes even financial pressures that are often at work behind the scenes. Safran Foer began his career as a writer without pressure of any kind, simply pursuing a unique personal vision that materialized rather quickly, while Fountain spent the better part of twenty years depending largely on his (extremely supportive and devoted) wife, who had a very high, very stable income. Their biggest sacrifice, as she notes via Gladwell, was to forgo buying BMWs in favor of Hondas, unlike her coworkers and their neighbors. In short, conceptually creative people have a much easier time finding financial success early on, while many experimentally creative people never have the option to do so, having to abandon their creative projects in favor of survival jobs that eventually become necessary careers.

It seems to me that the reading public readily and eagerly embraced Lehrer as a conceptually creative genius, one whose ease at synthesizing meaning out of the often seemingly disjunctive relationships between the world of science and the meaningful stuff of popular culture has even drawn comparisons to Gladwell himself. And certainly, Lehrer himself did nothing to disabuse us of this notion, producing intriguing and provocative material at an astonishing rate. True, Frontal Cortex often presented readers only with brief, unfinished concepts and ideas, but then again, Lehrer was not primarily a blogger – he was an author and a journalist. If he wanted to keep his full insights to himself from time to time for use in another, more fiscally rewarding project, then he was merely modeling behavior that thousands of other professional writers, both freelance and contract, engage in.

And therein lies the rub, in my opinion. If Lehrer had been a basic staff writer for Wired, or CNET, or any periodical publication, I would find it difficult to express any “sympathy for the devil,” as it were. Even the best journalists produce work of varying quality: go to the New York Times, read the editorial section going back a couple of weeks, and see how often even a world-renowned economist like Paul Krugman manages to share some sort of blinding insight that opens your worldview to unexpected vistas. For every piece fitting that description, you’ll find a dozen that are quietly and competently mundane, simply reporting an expert’s opinion on one matter or another. But Lehrer operated under different expectations: every piece he produced was supposed to shed light on some awe-inspiring new truth about science’s role in adding meaning and value to human life, and every blog post needed to highlight the meaningful nature of various otherwise arcane and esoteric (to the laymen amongst his readership) neuroscience experiments. Producing mundane insights was not an option for him; it was brilliance or nothing, at least as far as his readership was concerned.

The alternative – taking a break, pausing to catch his breath between books, speaking tours, and blog postings – was career suicide. Go to Twitter and run a search for #cogsci; now tell me, how far out of the loop would a cognitive science journalist be if he or she took a couple of months away from RSS feeds, social media discussions, and industry news sources? No one makes or maintains a reputation such as Lehrer’s by writing thoughtful and insightful commentary on the importance of a particular scientific study, six months after the release of the initial report. True, this sort of commentary may actually be superior, in the eyes of some, but again – writers of this sort (and I would put myself in this category) are not judged by the same criteria. In a world where my smartphone can tell me what one of the Kardashians had for lunch thirty seconds ago, who cares about well-reasoned reports about half-year-old events? There are many thinkers and readers who do, of course, but they do not factor into the equations of the market that depends on writers like Lehrer.

Make no mistake — I present my thoughts here not as a defense of Lehrer, but as an invitation to critique the system that creates the context for the sorts of decisions that Lehrer made. I condone none of Lehrer’s actions; certainly, as a devoted follower of St. Bob, I am appalled that someone would fabricate statements and claim that they were initially muttered by Dylan. And I am not interested in relieving him of agency in making these decisions; as Moynihan’s further research suggests, Lehrer had ample opportunity to correct his path along the way, but chose not to. In the end, though, I have to place a large portion of the blame on those readers and content consumers who demand that the so-called “creative geniuses” among us perform not only “on command,” but also at a pace and level that are cognitively unsustainable, to say the least. If Lehrer needed to take a break to collect his wits, well… maybe we should have let him have one.
