There’s lies, damn lies and statistics

[Graph of a spurious correlation: clear correlations "proven" statistically.]
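The classic demonstration of how such "clear correlations" arise is that two completely independent random walks will frequently appear strongly correlated. A minimal sketch (the seed, series length, and trial count are my own arbitrary choices, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 500, 200

big = 0
for _ in range(trials):
    # Two *independent* random walks: cumulative sums of separate noise.
    x = np.cumsum(rng.normal(size=n))
    y = np.cumsum(rng.normal(size=n))
    r = np.corrcoef(x, y)[0, 1]
    if abs(r) > 0.5:
        big += 1

print(f"{big}/{trials} pairs of independent walks had |r| > 0.5")
```

Despite having no causal or structural link whatsoever, a substantial fraction of these pairs show a large Pearson correlation, which is exactly why a correlation alone "proves" nothing.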

There’s a reason I stick with my Twitter feed. It brings up articles that I otherwise would have missed. Take these two.


It’s an interesting re-analysis of a climate change article by Lewandowsky, Gignac, and Oberauer (2013), presented by Dixon and Jones (2015), who concluded: “respondents convinced of anthropogenic climate change and respondents skeptical about such change were less likely to accept conspiracy theories than were those who were less decided about climate change.” This was in response to Lewandowsky et al., who had shown a robust linear relationship between climate change rejection and conspiracy theory ideation.

The rebuttal by Lewandowsky, Gignac, and Oberauer (2015) seems pretty compelling. The essence of the argument is methodological: when do you choose the statistical model to use and believe? The main point that Lewandowsky et al. make is:

“Dixon and Jones’s core argument is that the relationship between the two variables of interest, conspiracist ideation (CY) and acceptance of climate change (CLIM), is nonlinear, and that the models reported for both surveys were misspecified. To reach their conclusion, Dixon and Jones first make three questionable data-analytic choices to cast doubt on and attenuate the linear effects reported, before they purport that there is nonlinear relationship after reversing the role of the variables of interest in the statistical model for the panel survey. No statistical or theoretical justification for that reversal is provided, and none exists.”

So you can choose a different model, but if you do, you had better have a compelling reason for doing so. Dixon and Jones didn’t.

Hence, Lewandowsky et al. conclude:

“In summary, Dixon and Jones’s analysis has no bearing on the results we reported for either survey because it reaches its main conclusion only by reversing the role of criterion and predictor without any theoretical justification. The only statistical justification offered for that reversal (“with nonlinear models, it is important to explore relationships in both directions”) demonstrably does not apply. Without that reversal, Dixon and Jones’s criticism involving nonlinear relationships is moot because none are present.”

The main point was elegantly stated a bit earlier in the article.

“Any correlation matrix can be fit equally well by more than one model. This issue of equivalent models has been discussed repeatedly (e.g., Raykov & Marcoulides, 2001; Tomarken & Waller, 2005). The consensus solution is to limit the models under consideration to those that have a meaningful theoretical interpretation (MacCallum, Wegener, Uchino, & Fabrigar, 1993). Alternative models should reflect alternative theoretically motivated hypotheses, any mention of which is conspicuously lacking in Dixon and Jones’s Commentary.”

There are lies, damn lies and statistics. If you are going to use statistics wisely, you had better have a good theoretical model on which to base your proposed analysis. In the absence of one, any conclusion drawn is suspect.
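Lewandowsky et al.'s objection to reversing criterion and predictor is easy to see even in the simple linear case: regressing y on x and regressing x on y are different models, and neither is the inverse of the other unless the correlation is perfect. A quick sketch (variable names and data are illustrative, not from either paper):

```python
import numpy as np

rng = np.random.default_rng(42)

# A noisy linear relationship between two variables.
x = rng.normal(size=1000)
y = 0.6 * x + rng.normal(size=1000)

# Ordinary least-squares slope of y-on-x, and of x-on-y.
b_yx = np.cov(x, y, ddof=0)[0, 1] / np.var(x)
b_xy = np.cov(x, y, ddof=0)[0, 1] / np.var(y)
r = np.corrcoef(x, y)[0, 1]

# The two slopes are NOT reciprocals; their product equals r**2 < 1.
print(f"b_yx * b_xy = {b_yx * b_xy:.3f}, r^2 = {r**2:.3f}")
```

So swapping which variable is the predictor genuinely changes the fitted model, which is why doing so demands a theoretical justification rather than just a better-looking curve.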

As J.B.S. Haldane put it: “[T]he Universe is not only queerer than we suppose, but queerer than we can suppose.”

Unfortunately, at least the second of these two articles is proprietary rather than open access. If you have access to a library that carries Psychological Science, here are the relevant citations:

The open one, with a CC BY-NC licence, is:

Ruth M. Dixon and Jonathan A. Jones. Conspiracist Ideation as a Predictor of Climate-Science Rejection: An Alternative Analysis. Psychological Science, 26(5), 664–666. First published March 26, 2015. doi:10.1177/0956797614566469

The rejoinder, meanwhile, is behind a paywall. 😦

Stephan Lewandowsky, Gilles E. Gignac, and Klaus Oberauer. The Robust Relationship Between Conspiracism and Denial of (Climate) Science. Psychological Science, 26(5), 667–670. First published March 26, 2015. doi:10.1177/0956797614568432

Posted in Uncategorized | Leave a comment

The University as a Learning Design Problem

One of the things I’ve enjoyed in getting to know the community at the University of Texas at Austin is the energy that exists among fellow faculty to rethink the undergraduate experience. This is particularly challenging at a large, research-intensive state university, and especially so when such state universities have activist legislatures with a strongly conservative bent. As Mark Twain once said, “No man’s life, liberty or property are safe while the legislature is in session.”

The Campus Conversation, an activity started by President Bill Powers in 2014, was framed around three primary goals:

  1. How do we implement changes to curriculum and degree programs?
  2. How do we evolve pedagogy for 21st century learners?
  3. How do we create more opportunities for interdisciplinary and experiential learning for our undergraduates?

I have been drawn to one of the six faculty working groups in particular: the one addressing the question of establishing a Teaching Discovery Innovation Center.

Teaching Discovery Innovation Center

Focused on the creation of a faculty-led innovation center, this committee is engaging leaders and stakeholders across campus and externally to accelerate UT Austin’s advancement in this area.

After multiple conversations among a diverse group of interested faculty, we are approaching the point of proposing a way forward using the “90-day innovation process” developed by the Institute for Healthcare Improvement (IHI).

I had the pleasure, and it truly was a great pleasure, of spending time with a colleague at Georgetown University whom I highly regard, Assoc. Provost Randy Bass, who is leading a marvelous project entitled “Redesigning the Future(s) of the University”. They are doing some very critical thinking about the impediments to teaching in ways that are high impact, expansive (in the sense of including both the university and the rest of the world in their conduct) and empowering.

The Georgetown Redesign project describes its experiments in curriculum design and the undergraduate experience as ‘pump-priming ideas’, and it has started with five of them.

  1. Flexible Curricular and Teaching Structures
    1. These might include teaching courses in shorter modules and combining modules such that a set of two or three ends up being a traditional semester in length. Or it may mean unbundling credits into individual units to better fit the pacing of a student’s learning (taking a 6-credit ‘course’ and breaking it into three 2-credit units that could be taken together or separately).
  2. Competency-based learning – this one is relatively self-explanatory. From Georgetown’s perspective it would consist of one or more of the following elements:
    1. Explicit learning outcomes with respect to the required skills
    2. A flexible time frame to master these skills
    3. A variety of instructional activities to facilitate learning
    4. Certification and assessment based on learning outcomes
    5. Adaptable programs to ensure optimum learner guidance
  3. Expanding Mentored Research – Programs of study that shift from predominantly formal coursework to a substantially different balance of coursework and credit bearing mentored immersive learning through independent or collaborative projects.
  4. New Work/Learn Models – Programs of study that maintain or expand years to degree but include a substantial experiential component (e.g. workplace Co-op), dependent on Georgetown placement (in DC and globally), and guarantee both degree certification and intensive work experience on graduation.
  5. Four-year Combination BA/MA – Four-year combination BA/MA built around new configurations of online and self-paced learning, coursework and experiential learning.

This is really significant.

MIT has been beavering away at how the undergraduate experience needs to evolve through an Institute-wide Task Force, the product of which was the report The Future of MIT Education: Reinventing MIT Education Together. In it was a recommendation about greater modularity as well. One of modularity’s attributes, in addition to allowing students greater influence over their own learning pathway, is the creation of structural ‘holes’ in the curriculum, making time and places for experiential learning.

“Recommendation 7: The Task Force recommends that this commitment to pedagogical innovation for the residential campus be extended to the world to set the tone for a new generation of learners, teachers, and institutions…..

<stuff removed>

a. Exploration of modularity based on learning objectives and measurable outcomes. In January 2014 Harvard and MIT released a report summarizing an analysis of the data collected during the first year of open online classes. Modularity refers to breaking a subject into learning units or modules, which can be studied in sequence or separately. The finding that drew the most attention is the low rate at which students who enroll in an MITx or HarvardX class complete it. The first 17 HarvardX and MITx classes recorded 841,687 registrations, of which only 43,196 (5.1%) earned a certificate of completion.

“While the completion rate is low, other data from the report suggests that students are focused more on learning certain elements of a class and less on completing what has traditionally been considered a module or unit of learning. For instance, in addition to those who completed a course through MITx or HarvardX, 35,937 registrants explored half or more of the units in a course, and 469,702 viewed some but less than half of the units of a course. The way in which students are accessing material points to the need for the modularization of online classes whenever possible. The very notion of a “class” may be outdated. This in many ways mirrors the preferences of students on campus. The unbundling of classes also reflects a larger trend in society—a number of other media offerings have become available in modules, whether it is a song from an album, an article in a newspaper, or a chapter from a textbook. Modularity also enables “just-in-time” delivery of instruction, further enabling project-based learning on campus and for students worldwide.”
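The report's headline figures are easy to recompute using only the numbers quoted above:

```python
# Figures quoted from the MIT/Harvard report (registrations across the
# first 17 HarvardX and MITx classes).
registrations = 841_687
certified     = 43_196    # earned a certificate of completion
explored_half = 35_937    # explored half or more of the units
viewed_some   = 469_702   # viewed some, but less than half, of the units

print(f"completion rate: {certified / registrations:.1%}")
engaged = certified + explored_half + viewed_some
print(f"touched course material in some form: {engaged / registrations:.1%}")
```

The completion rate comes out at the reported 5.1%, but roughly two-thirds of registrants engaged with at least some material, which is the pattern the Task Force reads as favouring modular units over whole "classes".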

Is there a pattern emerging?

— pdl —

Posted in curriculum_redesign, experiential learning, innovation, interdisciplinary_learning, mission, undergraduate_education, UT_Austin | 2 Comments

AAAS Vies for the Title of “Darth Vader of Publishing”

AAAS is vying for the crown of Lord Vader, or Chief Evildoer, in its approach to suppressing the open dissemination of scientific knowledge, even when that knowledge is paid for by taxpayer money in the first place. They claim to support open access, but they redefine it as a pay-to-publish charge (APC) of $3,000 USD that restricts the subsequent use of the information in the article, preventing commercial reuses such as publication on some educational blogs and incorporation into educational material, as well as the use of this information by small to medium enterprises. If you really mean open access, the way the rest of the world defines it, you’ll have to pay a surcharge of an additional $1,000. But it gets worse.

Faux Open Access  Journal “Science Advances” (perhaps better “Poor Researchers Restricted”)

A new faux open access journal, Science Advances, is being launched next year that will, get this, charge an additional US$1,500 above the fees listed previously to publish articles that are more than ten pages long. Wait… this is a born-digital publication with no paper distribution. They’re charging $1,500 plus the $4,000 to publish an open access article longer than ten pages. It is bits, right? Their argument is that the freely provided peer review process is more difficult with longer papers, so they should charge more for the effort, seeing as how they are getting their reviews for nothing anyway and this is just pure profit (and who doesn’t like pure profit?). They claim that the additional ‘editorial services’ justify this surcharge.
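A quick tally of the charges described above, restating only the post's own numbers, shows how the total cost of a long, genuinely open article stacks up:

```python
# Fee schedule as described in the post (all amounts in USD).
base_apc         = 3000   # standard article-processing charge
cc_by_surcharge  = 1000   # to get a genuinely open (CC BY) licence
length_surcharge = 1500   # Science Advances fee for papers over ten pages

total = base_apc + cc_by_surcharge + length_surcharge
print(f"cost to publish a long, truly open-access article: ${total:,}")
```

That arithmetic is where the $5,500 per-article figure comes from.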

What about publishing your data along with your paper? Why would enough detail to replicate experiments be important, since when we do attempt replication we often can’t verify the original results anyway…

The AAAS commitment to open access is worth examining. Besides redefining it so that it isn’t open and accessible at all, they published a widely criticised study claiming the peer review process in open access journals is suspect. The criticisms were methodological. John Bohannon, a correspondent for Science, published the results of an experiment, a sting operation, in Science in an article entitled “Who’s Afraid of Peer Review?” to expose the problems of open access publishing. You can read the full text here – or you could, if you had access to the journal Science, which is behind a paywall. Those of you from universities or colleges with a subscription to Science can gain access through your library, assuming it has been able to afford to keep the online subscription. The rest of you, well, you’ll have to take my word for it.

It Never Was an Open Access Study

The problem is that it wasn’t a study of open access journals at all. There was no population of other journal types to compare against. He sent a bogus article to a group of over 300 open access journals drawn from a list on the Directory of Open Access Journals site. Fair enough. But none were sent to proprietary journals, so the experiment never explicitly compared the open access model to the subscription model. What he’s pointing out, as Martin Eve notes in The Conversation (a good article, btw), is that the sting highlights problems with the peer review process. These are not new, and they are a major concern. But they are no more related to open access than they are to subscription economic models.

The bottom line is that Science and the AAAS are trying to redefine open access to make it compatible with their highly lucrative subscription model. They are levying APCs of up to $5,500 per article to “make a report openly accessible.” They’ve published a flawed ‘study’ to justify their actions and put up a website to support their mistaken claims. They have hired a managing editor for their new ‘open access’ publication Science Advances who is openly critical of the open access movement. They have lobbied the UN to try to pressure Farida Shaheed, the Special Rapporteur in the field of cultural rights at the United Nations, who is preparing a report on open access for the UN Human Rights Council, calling open access young (aka immature), experimental (aka risky) and unable to demonstrate the benefits that to them clearly exist in traditional reader-pays (aka subscription-based) publishing models.

It’s time to recognise when a monopoly is trying to consolidate its position at the expense of the very people on whose work its prestige depends. Shame on AAAS.

Posted in open_scholarship | Tagged , , , | 1 Comment


This fall, in North America, a new open course is starting on open learning, the meaning of connection, and the ways in which it is really possible to engage in distributed learning at a distance. There have been disparaging remarks about the degree to which innovation in learning is really possible at the scale that massive open courses have achieved. What’s innovative in the pedagogy of learning at scale when it is so heavily “designed” that ad hoc small group discussion becomes difficult if not impossible? More bluntly, how is video-recorded lecturing to 100k learners an improvement in pedagogical practice? The question raised is: does scale prevent good pedagogical practice?

This is one of many questions I hope to explore with others in the upcoming Connected Courses learning event starting this September.

Posted in ConnectedLearn | Tagged | 3 Comments

The Post Digital Age

Phillip Long, Ph.D.
Institute for Teaching and Learning Innovation, UQx Project
The University of Queensland
10 August 2014
Image by amattox mattox, some rights reserved (CC BY-NC)

It’s about time we got there. We started with a model of learners working either independently or in a close relationship with mentors, instructors, or teachers, absorbing from them knowledge built upon scarce information that only their teachers held. Knowledge and its power were a direct correlate of what you could remember and recall when the time and place for it arose.

But the growth of facts began to challenge even the most capacious human minds. We took to recording them, laboriously by scribes, onto an external storage medium. The effort involved made the scarce product costly, adding economic value to the mix. Among the knowledge elite this precious external storage medium was prized and guarded. Practitioners of the passage of knowledge were seen at times silently mouthing words and sentences in a mystical and, to some, very frightening practice of ‘quiet reading’. They translated the coded representations of knowledge on the fly as their eyes danced across the storage medium, bringing them to express things that those around them knew to be beyond their experience, and scaring the unlearned with the “power” in these new devices… books.

The revolution of the printing press democratised access to information. It was no longer only the rich and powerful who owned knowledge. The transition took several hundred years, but it laid part of the foundation for the explosion of knowledge that characterised the Renaissance. It also changed the way we think. What once counted as knowledge by virtue of memory and recall could now be stored outside the little grey cells in your cranium. We needed an indexing system to track all of that externally stored information, and mechanisms for its efficient retrieval.

Various mechanisms emerged: descriptions of the location of the physical objects (the Persian city of Shiraz’s library, 10th century (1)), the location itself coded by numbers (the library at Amiens Cathedral in France (2)), or Thomas Hyde’s printed catalogue of the books in the Bodleian Library, Oxford University. All of these were attempts to organise, and make more accessible to humans, the increasingly vast body of information accumulating in the world of knowledge creation.

We have always, as tool-making creatures, used our ability to build things to improve our existence. Initially this was focused on survival, but as we became more capable it quickly spread to other aspects of making life easier, better and more fulfilling. To organise our knowledge we built repositories in the form of library collections, and indexed them, at first rather arbitrarily, and later through classification schemes (e.g., the Dewey Decimal System).


The Jacquard head for dobby looms.

The advent of representing information more abstractly, in terms of binary coding of human-readable characters, launched the digital revolution. The initial physical manifestations derived from the cutting-edge technology of the period: gears, levers and pulleys gave way to a re-appropriation of the loom, with punched holes read as ones and their absence as zeros. The “Jacquard head” for dobby looms provided the insight connecting an abstract pattern to the Jacquard weaving that resulted. His key idea was the use of hole-punched cards to represent a sequence of operations, leading to the Analytical Engine of Charles Babbage and later the card tabulating machine of Herman Hollerith, used to perform the 1890 US Census.
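The card's principle, the presence or absence of a hole standing for one or zero, is the same binary encoding of characters we still use today. A toy sketch of rendering text as "punch rows" (purely illustrative, and not Jacquard's or Hollerith's actual card format):

```python
def punch_card_rows(text: str) -> list[str]:
    """Render each character's 8-bit code as a row of punches,
    where 'O' marks a hole (a 1 bit) and '.' marks no hole (a 0 bit)."""
    rows = []
    for ch in text:
        bits = format(ord(ch), "08b")          # character code as 8 binary digits
        rows.append(bits.replace("1", "O").replace("0", "."))
    return rows

for row in punch_card_rows("loom"):
    print(row)
```

Each printed row is one character of the input, a pattern of holes a machine could read mechanically, exactly the leap the punched card made possible.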

Tying all this together is the use of technology to augment the human intellect. Fast forward to the end of World War II: the same concern about the proliferation of information, and about ways to find and use it rather than continue a cycle of rediscovery, was expressed by Vannevar Bush in the classic “As We May Think”, published in The Atlantic in July 1945 (3). In it he proposed, based on the state of the art of his day, the Memex, a machine to record knowledge by mimicking the human search process through what he termed associative trails. He wanted to record not just the artefacts but the way in which humans thought through the steps that led to the formation of those artefacts. Further, he wanted to make these shareable so that others could see not just the result, but the process by which that result was derived.

It took another 23 years before the technology of the day could at least attempt a physical implementation of this idea. It was presented to the world in a breathtaking live demonstration at the Civic Auditorium in San Francisco, California by Doug Engelbart in the “Mother of All Demos” (4). The demo was an attempt to show, rather than talk about, something Engelbart had written six years before in the landmark paper Augmenting Human Intellect: A Conceptual Framework (5). In one 100-minute ‘show and tell’, Doug demonstrated the computer mouse, video conferencing, teleconferencing, hypertext, word processing, hypermedia, object addressing and dynamic file linking, bootstrapping, and a collaborative real-time editor. Our world was forever changed.

Today we routinely offload memory into silicon. Some refer to this as Google dumbing down our intellect (6), but we’ve been doing it for centuries; we just do it better and more efficiently today. And while it appears we’re mired in our devices, walking head-down into street signs as we text away, or sitting at the dinner table with friends while ‘friending’ people who aren’t present through the devices formerly known as phones, we are moving past that era.

What marks this shift? The post-digital age is like prior radical transitions: it’s marked by the fact that we no longer recognise it as different. Think back to when your parents had an icebox. They replenished it with block ice at least daily. And then something happened: refrigeration. In less than a generation we went from astonishment at this miracle to forgetting the world had ever been different.


Zoom gesture comes naturally. CC BY-NC-SA, Alec Couros

Look at children playing today with their parents’ smartphones, or perhaps their own tablet computers. When they walk up to printed pictures now, they naturally try to manipulate them with the ubiquitous thumb-and-forefinger spread to zoom the image. We walk around with digital sensors measuring our gait, altitude, and velocity, and glance at our ‘phones’ (we need a new term for these) to see a dashboard of our activity.

More importantly, we are starting to see and think in ways we couldn’t before, because our devices are shaping what we conceive of as questions. In 2011, we began to make ‘movies’ by directly recording the impulses from voxels in the brain to reconstruct imagery recalled from the memory of movies we had seen (7). We are on the verge of communicating rich media from neural storage to the sensors that pre-process it in others. It won’t be long before we have the capability to transfer these memories, bypassing their biological encoding. Will they be ‘memories’ at all without this step? Will we perceive them the same as those created by our own neural infrastructure? We don’t know yet, but we soon will.

As the embedding of digitally enabled devices extends the concept of the internet of things (8) to the interconnection of networked objects with ourselves, we silently enter the post-digital age. As David Foster Wallace wrote, “the most obvious, ubiquitous, important realities are often the ones that are the hardest to see and talk about” (9). Fish don’t see water, but we must. Welcome to the post-digital age.



(2) Joachim, Martin D., Ed. Historical Aspects of Cataloging and Classification, Volume 2 The Haworth Information Press, Binghamton, NY, 2003, p. 460.





(7) Nishimoto, S., et al. (2011). Reconstructing visual experiences from brain activity evoked by natural movies. Current Biology, 21(19), 1641–1646.

(8) Ashton, Kevin (22 June 2009). “That ‘Internet of Things’ Thing: In the real world things matter more than ideas”. RFID Journal.


Posted in innovation, post-digital | Tagged , , , | 1 Comment

Radically Transparent Research – or Why Publish Before Peer Review?

I was reading Gardner Writes, following my colleague and friend’s thoughts from the other side of the globe with great interest and in anticipation of the thinking he forces me to do. He’s been on a multipart-series kick lately, probably to break up the long piece of discursive writing that formed the spine of a report he wrote for his home institution, and thereby make it more accessible and easier to digest.

He made an aside referencing a blog post by Dave Winer about why he (Winer) writes in public. This resonated particularly at this moment because we’re in the midst of writing a grant proposal for a funding body in Australia, one thrust of which is how to break the cycle of the traditional linear research methodology. In engineering education, the domain at which this proposal is targeted, as in most engineering and scientific disciplines, the process can be described as:

Conceive — Design — Implement — Operate — Analyze — Disseminate

(That’s a modification of the engineering methodology some of you may be familiar with from the CDIO consortium)

The point is that you follow your experimental protocol: you conceptualise hypotheses derived from a theoretical framework you think informs the work; you design the experiment and the methodology to collect data that might support or refute the hypotheses; you translate that methodology into something that might actually allow you to do the work; you conduct the experiment, collecting the data and trying out the methods you think will inform you about the questions you posed; you analyse the data when the experiment is over to see what came out; and then, perhaps with the others involved, you write it up to share it with your colleagues. The journal publication process alone can take from as little as three to four months to upwards of 18 months to two years. All up, the cycle you’ve just engaged in is measured in years.

Since all of your colleagues in the community are likewise going through this process to explore their own hypotheses, the asynchronous and overlapping timelines naturally lead to results being shared at various stages of the cycle. In fact it’s more likely than not that some of this new knowledge will have value for your own work, indeed will potentially influence it; and if you weren’t locked into the methodology you’re following so that your results are experimentally sound, you’d likely have changed something along the way to leverage the knowledge you’ve just read about and discussed with the authors via email or at that conference you both attend annually.

All up, the two-year period it takes to devise, conduct and report on your work is as much of a problem as any aspect of the work itself. If the point of all this is to learn and improve the educational process, the chances of doing that meaningfully in our lifetimes are low. After all, many of the outcomes from this cycle aren’t going to be particularly informative: two years to report that you really didn’t find anything significant this time around is a long time in the work life of an academic, and even longer in the educational trajectories of our students.

Dave Winer described why he writes his blog saying,

I write to express myself, and to learn. Writing is a form of processing my ideas. When I tell a story verbally a few times, I’m ready to write it. After writing it, I understand the subject even better.

The connection to why he writes and why we publish is similar. But the problem is the timescale. Blogging is relatively rapid. The short cycle time helps us share our thoughts and forces us to consider the things we’re thinking about. And, it invites others to see, consider, and respond to what we’re thinking about.

I write to give people something to react to. So you think the iPhone was a winner from Day One. Great. Tell me why. Maybe I’ll change my mind. It’s happened more than once that a commenter here showed me another way of thinking about something and I eventually came around to their point of view. And even if I don’t change my mind, it’s helpful to understand how another person, given the same set of facts, can arrive at a different conclusion.

That’s the discursive dialogue that we’re missing in science, but which has been developed and understood in the open source community for some time. That’s the community from which Winer comes, and the transparent sharing of thoughts and ideas is part of the culture, the method of open source development. And that’s what’s missing from the higher ed learning research community. Our livelihoods are tied to the articles we produce, to the impact factor of the journals in which they are published, the attribution of the ideas to ourselves as the original authors. We sacrifice the enormous potential the community could give us by holding fast to the belief that if we share it openly we’ll see our intellectual contribution devalued, lost, or worse, stolen, by someone else claiming the idea as their own. We worry about our ideas being ‘stolen’ while in reality the fact that we put them out there in the first place establishes our provenance.

Once upon a time, when the production and sharing of ideas took huge amounts of capital, significant effort to achieve the production values we sought in quality work, and complex, costly, and time-consuming mechanisms to distribute the work to our colleagues via the journals trucked, mailed and shipped to libraries in higher ed institutions around the world, concern about the provenance of an idea was more meaningful. Recall the Darwin/Wallace conundrum that surrounded the first published comprehensive expression of the idea of evolution by natural selection, presented to the Linnean Society in London. (Not familiar with this extraordinary coincidence of paradigm-shifting ideas co-occurring? It’s a fascinating story, and one that owes a tremendous amount to Charles Lyell. Darwin was resigned to being ‘forestalled’ in getting the idea of natural selection, as the driving force that works on natural variation, out to the world as his own work. The process by which this was addressed, the compromise that emerged, and the thoughtful intervention and guidance of Lyell are a lesson in ethical conduct, friendship and skilful political savvy.)

The grant we’re writing is in part an attempt to demonstrate that there is another, more promising way to conduct this enterprise. The open notebook science movement has been around for some years; we are NOT claiming novelty here. We are simply trying to appropriate the methodology and apply it to research on learning design in engineering education. In this we’re adapting work done by others, the most recent of which I’ve been reading on the blog of Mel Chua. There she writes:

Radical transparency refers to the cultural practices used by healthy open communities (Free/Libre and open source software, open content, and open hardware projects) to expose their work in as close to realtime as possible and in a way that makes it possible for others to freely and non-destructively experiment with it.

From Mel Chua: Hacker. Writer. Researcher. Teacher. Human jumper cable.

We have to take the collective wisdom of the community and carefully apply it more rapidly to improve learning design for higher education students. We don’t want to put the students in our courses at risk, a concern often raised as a “show stopper” by those who think that unless there is incontrovertible evidence of improvement from different teaching approaches, it’s better to stay with what works. But does it? That is, does it work? How do you know? And is the asserted risk really there? As a colleague and co-writer of this grant once said in a panel discussion with me about changing his teaching to the ‘flipped classroom’ approach:

Students won’t let you ‘fail’. They will raise concerns early and loudly if something isn’t going well. Any teacher worth being in front of the room will respond and change what they are doing to avoid the catastrophe. Things may not go as you planned, but they won’t end up damaging the students in the class because they, and a good teacher, won’t let that happen.

So where is the real risk? It’s in not being open and transparent to learn together.


Posted in Open_Research | Tagged , , , | Leave a comment

A literary aside

I was reading material about the role of certification played by higher education institutions. It’s part of the background work I’m doing for a piece on badging for a local website with my colleague and linguist, aka word guru, Roly Sussex O.A.E. But you know how these things happen.

Zombie literature?
Some rights reserved by e_monk

A reference in the paper I was reading pointed to an article in Inside Higher Ed. The link was about scholarly publication and the role of peer review, at least in part. Steve Kolowich, the author of the IHE piece, was describing some ideas of Kathleen Fitzpatrick, director of scholarly communication at the Modern Language Association and a professor of media studies at Pomona College. Her point was that the scholarly book is no longer the primary mode of communication in the digital age. Yet it remains entrenched in the hallways of some disciplines as the only means by which one can jump through the tenure hoop. She described the scholarly monograph not as dead, but as undead.

I realise I spend little time in the literary world. I don’t re-read Jane Austen, and despite the recent anniversary I haven’t picked up The Pickwick Papers or A Tale of Two Cities (despite the New York Times poster that leveraged the opening lines to great effect many years ago; by the way, if anyone knows where one can get a reprint of that classic NYT advertising poster, let me know. I had one once and didn’t keep it. 😦 ) What got me intrigued was the reference to “Pride and Prejudice and Zombies: The Classic Regency Romance – Now with Ultraviolent Zombie Mayhem!”. Huh?

Seth Grahame-Smith wrote a novel re-imagining the classic Pride and Prejudice, but with zombies! From the Amazon book description:

a mysterious plague has fallen upon the quiet English village of Meryton—and the dead are returning to life! Feisty heroine Elizabeth Bennet is determined to wipe out the zombie menace, but she’s soon distracted by the arrival of the haughty and arrogant Mr. Darcy. What ensues is a delightful comedy of manners with plenty of civilized sparring between the two young lovers—and even more violent sparring on the blood-soaked battlefield. Can Elizabeth vanquish the spawn of Satan? And overcome the social prejudices of the class-conscious landed gentry?

Ok. So much for the draft of our column.  Where’s my Kindle???

— pdl —

Read more: Inside Higher Ed

Posted in Scholarly Communication | Tagged , , , | Leave a comment