AAAS Vies for the Title of “Darth Vader of Publishing”

AAAS is vying for the crown of Lord Vader, Chief Evildoer, in its approach to suppressing the open dissemination of scientific knowledge, even when that knowledge is paid for by taxpayer money in the first place. They claim to support open access, but they redefine it as a pay-to-publish charge (APC) of US$3,000 that still restricts subsequent use of the information in the article, preventing commercial reuses such as publication on some educational blogs, incorporation into educational material, and use of the information by small to medium enterprises. If you really mean open access the way the rest of the world defines it, you’ll have to pay an additional $1,000 surcharge. But it gets worse.

Faux Open Access Journal “Science Advances” (perhaps better, “Poor Researchers Restricted”)

A new faux open access journal, Science Advances, is being launched next year that will, get this, charge an additional US$1,500 on top of the fees listed above to publish articles that are more than ten pages long. Wait… this is a born-digital publication with no paper distribution. They’re charging $1,500 on top of the $4,000 to publish an open access article longer than ten pages. It’s all bits, right? Their argument is that the freely provided peer review process is more difficult with longer papers, so they should charge more for the effort – even though they get their reviews for nothing, making this pure profit (and who doesn’t like pure profit?). They claim that additional ‘editorial services’ justify this surcharge.
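To see how the charges stack, here is a back-of-the-envelope tally in code. The function and variable names are mine; the dollar figures are the fees described above (and they account for the US$5,500 total that appears later in this post).

```python
# Back-of-the-envelope tally of the "open access" charges described above.
# Names are mine; the figures are the quoted fees.
BASE_APC = 3000          # pay-to-publish charge, restrictive license
CC_BY_SURCHARGE = 1000   # extra for genuinely open, rest-of-world reuse terms
LENGTH_SURCHARGE = 1500  # extra for articles over ten pages, in a journal with no paper edition

def open_access_fee(pages: int, truly_open: bool) -> int:
    """Total article charge in USD."""
    fee = BASE_APC
    if truly_open:
        fee += CC_BY_SURCHARGE
    if pages > 10:
        fee += LENGTH_SURCHARGE
    return fee

print(open_access_fee(pages=12, truly_open=True))  # 5500
```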

What about publishing your data along with your paper? Why would enough detail to replicate experiments be important? After all, when we do replicate, we often fail to verify the original results anyway…

The AAAS commitment to open access is worth examining. Besides redefining open access so that it isn’t open and accessible at all, they published a widely criticised study claiming the peer review process in open access journals is suspect. The criticisms were methodological. John Bohannon, a correspondent for Science, published the results of an experiment, or sting operation, in Science in an article entitled “Who’s Afraid of Peer Review?” to expose the problems of open access publishing. You can read the full text here – or you could, if you had access to the journal Science, which is behind a paywall. Those of you at universities or colleges with a subscription to Science can gain access through your library, assuming it has been able to afford to keep the online subscription. The rest of you, well, you’ll have to take my word for it.

It Never Was an Open Access Study

The problem is that it wasn’t a study of open access journals at all. There was no population of other journal types to compare against. Bohannon sent a bogus article to a group of over 300 open access journals drawn from a list on the Directory of Open Access Journals site. Fair enough. But none were sent to proprietary journals, so the experiment never explicitly compared the open access model to the subscription model. As Martin Eve notes in The Conversation (a good article, by the way), what Bohannon is actually highlighting are problems with the peer review process. These are not new, and they are a major concern. But they are no more related to open access than they are to subscription economic models.

The bottom line is that Science and the AAAS are trying to redefine open access to make it compatible with their highly lucrative subscription model. They are levying APCs of up to $5,500 per article to “make a report openly accessible.” They’ve published a flawed ‘study’ to justify their actions and put up a website to support their mistaken claims (http://scicomm.scimagdev.org/). They have hired a managing editor for their new ‘open access’ publication Science Advances who is openly critical of the open access movement. And they have lobbied the UN, pressuring Farida Shaheed, the Special Rapporteur in the field of cultural rights who is preparing a report on open access for the UN Human Rights Council, calling open access young (read: immature), experimental (read: risky), and unable to demonstrate the benefits that to them clearly flow from traditional reader-pays (read: subscription-based) publishing models.

It’s time to recognise when a monopoly is trying to consolidate its position at the expense of the very people on whose work its prestige depends. Shame on AAAS.

Posted in open_scholarship | 1 Comment

ConnectedCourses

This fall, in North America, a new open course is starting on open learning, the meaning of connection, and the ways in which it is really possible to engage in distributed learning at a distance. There have been disparaging remarks about the degree to which innovation and learning are really possible at the scale that massive open courses have achieved. What’s innovative about pedagogy for learning at scale that is so heavily “designed” it makes ad hoc small group discussion difficult if not impossible? More bluntly, how is video-recorded lecturing to 100k learners an improvement in pedagogical practice? The question raised is: does scale prevent good pedagogical practice?

This is one of many questions I hope to explore with others in the upcoming Connected Courses learning event starting this September.

Posted in ConnectedLearn | 3 Comments

The Post Digital Age

Phillip Long, Ph.D.
Institute for Teaching and Learning Innovation, UQx Project
The University of Queensland
10 August 2014
Image by amattox mattox, Flickr.com,
Some rights reserved (cc by nc)

It’s about time we got there. We started with a model of learners working either independently or in close relationship with mentors, instructors, or teachers, absorbing from them knowledge built upon scarce information that only their teachers held. Knowledge, and its power, was a direct correlate of what you could remember and recall when the time and place demanded it.

But the growth of facts began to challenge even the most capacious human minds. We took to recording them, laboriously, by scribes, onto an external storage medium. The effort compounded the scarcity with cost, adding economic value to the mix. Among the knowledge elite this precious external storage was prized and guarded. Practitioners of the passage of knowledge were seen at times silently mouthing words and sentences in a mystical and, to some, very frightening practice of ‘quiet reading’. They translated the coded representations of knowledge on the fly as their eyes danced across the storage medium, bringing them to express things that those around them knew to be beyond their experience – and scaring the unlearned with the “power” in these new devices… books.

The revolution of the printing press democratised access to information. Owning knowledge was no longer the preserve of the rich and powerful. The transition took several hundred years, but it laid part of the foundation for the explosion of knowledge that characterised the Renaissance. It also changed the way we think. What once counted as knowledge by virtue of memory and recall could now be stored outside the little grey cells in your cranium. We needed an indexing system to keep track of all that externally stored information, and mechanisms for its efficient retrieval.

Various mechanisms emerged: descriptions of the location of the physical objects (the library of the Persian city of Shiraz, 10th century (1)), the location itself coded by numbers (the library at Amiens Cathedral in France (2)), or Thomas Hyde’s Incunabulum, a printed catalogue of the books in the Bodleian Library, Oxford University. All were attempts to organise, and make more accessible to humans, the increasingly vast body of information accumulating in the world of knowledge creation.

We have always, as tool-making creatures, used our ability to build things to improve our existence. Initially this was focused on survival, but as we became more capable it quickly spread to other ways of making life easier, better and more fulfilling. To organise our knowledge we built repositories in the form of library collections, and indexed them at first arbitrarily, and later through classification schemas (e.g., the Dewey Decimal System).

The Jacquard head for dobby looms.

The advent of representing information more abstractly, in the binary coding of human-readable characters, launched the digital revolution. The initial physical manifestations derived from the cutting-edge technology of the period: gears, levers and pulleys gave way to the re-appropriation of the loom, with punched holes representing ones and their absence representing zeros. The “Jacquard head” for dobby looms provided the insight connecting pattern in the abstract with the Jacquardian weaving that resulted. Jacquard’s key idea was the use of hole-punched cards to represent a sequence of operations, an idea that led to the Analytical Engine of Charles Babbage and later to the card tabulating machine Herman Hollerith built to perform the 1890 US Census.
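The leap from cloth to computation is easier to see with a toy sketch. The code below is my own illustration, using modern 8-bit character codes rather than Jacquard’s actual card layout: each character becomes a row of ‘holes’ (ones) and blanks (zeros).

```python
# Toy illustration of punched-card encoding: each character's 8-bit code
# becomes a row, with 'o' marking a hole (1) and '.' its absence (0).
def punch(text: str) -> list[str]:
    rows = []
    for ch in text:
        bits = format(ord(ch), "08b")  # 8-bit binary code for the character
        rows.append("".join("o" if b == "1" else "." for b in bits))
    return rows

for row in punch("LOOM"):
    print(row)
```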

Tying all this together is the use of technology to augment the human intellect. Fast forward to the end of World War II: the same concern about the proliferation of information, and about ways to find and use it rather than continue a cycle of rediscovery, was expressed by Vannevar Bush in the classic “As We May Think,” published in The Atlantic, July 1945 (3). In it he proposed, based on the state of the art of his day, the Memex, a machine to record knowledge by mimicking the human search process through what he termed associative trails. He wanted to record not just the artefacts but the way in which humans thought through the steps that led to the formation of those artefacts. Further, he wanted to make these shareable, so that others could see not just the result but the process by which that result was derived.

It took another 23 years before the technology of the day could at least attempt a physical implementation of this idea. It was presented to the world in a breathtaking live demonstration by Doug Engelbart at the Fall Joint Computer Conference in San Francisco, California – the “Mother of All Demos” (4). The demo was an attempt to show, rather than talk about, what Engelbart had written six years before in the landmark paper Augmenting Human Intellect: A Conceptual Framework (5). In one 100-minute ‘show and tell’, Doug introduced the computer mouse, video conferencing, teleconferencing, hypertext, word processing, hypermedia, object addressing and dynamic file linking, bootstrapping, and a collaborative real-time editor. Our world was forever changed.

Today we routinely offload memory into silicon. Some refer to this as Google dumbing down our intellect (6), but we’ve been doing it for centuries – we just do it better and more efficiently today. And while it appears we’re mired in our devices, walking head-down into street signs as we text, or sitting at the dinner table with friends while ‘friending’ people who aren’t present through the devices formerly known as phones, we are moving past that era.

What marks this shift? The post digital age is like prior radical transitions – it’s marked by the fact we no longer recognise it as different. Think back to when your parents had an icebox. They replenished it with block ice at least daily. And then something happened. Refrigeration. And in less than a generation we went from astonishment at this miracle, to forgetting the world was different in the time before it.

Zoom gesture comes naturally. cc by nc sa Alec Couros, https://www.flickr.com/photos/courosa/

Look at children playing today with their parents’ smartphones, or perhaps their own tablet computers. When they walk up to printed pictures they naturally try to manipulate them with the ubiquitous thumb-and-forefinger spread to zoom the image. We walk around with digital sensors measuring our gait, altitude, and velocity, and glance at our ‘phones’ (we need a new term for these) to see the dashboard of our activity.

More importantly, we are starting to see and think in ways we couldn’t before, because our devices are shaping what we conceive as questions. In 2011, researchers began to make ‘movies’ by recording the responses of voxels in the brain’s visual cortex and using them to reconstruct the imagery of movies their subjects had watched (7). We are on the verge of communicating rich media from neural storage to the sensors that pre-process it in others. It won’t be long before we have the capability to transfer these memories, bypassing their biological encoding. Will they be ‘memories’ at all without that step? Will we perceive them the same way as those created by our own neural infrastructure? We don’t know yet, but we soon will.

As the embedding of digitally enabled devices extends the concept of the internet of things (8) to the interconnection of networked objects with ourselves, we silently enter the post-digital age. As David Foster Wallace wrote in 2008, “the most obvious, ubiquitous, important realities are often the ones that are the hardest to see and talk about” (9). Fish don’t see water, but we must. Welcome to the post digital age.

Citations

(1) http://www.narcis.nl/

(2) Joachim, Martin D., ed. Historical Aspects of Cataloging and Classification, Volume 2. The Haworth Information Press, Binghamton, NY, 2003, p. 460.

(3) http://www.theatlantic.com/magazine/archive/1945/07/as-we-may-think/303881/

(4) https://www.youtube.com/watch?v=yJDv-zdhzMY

(5) http://www.dougengelbart.org/pubs/augment-3906.html

(6) http://theatln.tc/1B55rQO

(7) Nishimoto, S., et al. (2011). Reconstructing visual experiences from brain activity evoked by natural movies. Current Biology, 21(19):1641-1646.

(8) Ashton, Kevin (22 June 2009). “That ‘Internet of Things’ Thing, in the real world things matter more than ideas”. RFID Journal.

(9) http://www.theguardian.com/books/2008/sep/20/fiction

Posted in innovation, post-digital | 1 Comment

Radically Transparent Research – or Why Publish Before Peer Review?

I was reading Gardner Writes, following my colleague and friend’s thoughts from the other side of the globe with great interest and in anticipation of the thinking he forces me to do. He’s been on a multipart-series kick lately, probably to break up a long piece of discursive writing that formed the spine of a report he wrote for his home institution, and thereby make it more accessible and easier to digest.

He made an aside referencing a blog post by Dave Winer about why he (Winer) writes in public. This resonated particularly for me at this moment, as we’re in the midst of writing a grant proposal for a funding body in Australia, one thrust of which is how to break the cycle of the traditional linear research methodology. In engineering education, the domain this proposal targets, as in most engineering and scientific disciplines, the process can be described as:

Conceive — Design — Implement — Operate — Analyze — Disseminate

(That’s a modification of the engineering methodology some of you may be familiar with from the CDIO consortium.)

The point of this is that you follow your experimental protocol: conceptualise the hypotheses derived from the theoretical framework you think informs the work; design the experiment and the methodology to collect data that might support or refute the hypotheses; translate that methodology into something that will actually let you do the work; conduct the experiment, collecting the data and trying out the methods you think will inform the questions you posed; analyse the data when the experiment is over to see what came out; and then, perhaps with the others involved, write it up to share with your colleagues. The journal publication step alone can take from as little as three to four months to upwards of 18 months or two years. All up, the cycle you’ve just engaged in is measured in years.
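To make that timescale concrete, here is a rough tally of the cycle. The stage durations are my own illustrative assumptions, not measured figures; only the three-months-to-two-years publication lag comes from the discussion above.

```python
# Rough tally of the research-to-publication cycle. Stage durations are
# illustrative assumptions; the publication lag spans 3-24 months in practice.
stages_months = {
    "conceive": 2,
    "design": 3,
    "implement": 4,
    "operate (run the experiment)": 6,
    "analyze": 3,
    "disseminate (journal review and publication)": 12,
}

total = sum(stages_months.values())
print(f"{total} months, or about {total / 12:.1f} years end to end")  # 30 months, ~2.5 years
```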

Since all of your colleagues in the community are likewise going through this process to explore their own hypotheses, the asynchronous and overlapping timelines naturally lead to results being shared throughout the stages of the cycle. In fact it’s more likely than not that some of this new knowledge will have value for your work, indeed will potentially influence it; and if you weren’t fixed to the methodology you’re following so that your results remain experimentally sound, you’d likely have changed something along the way to leverage the knowledge you’ve just read and discussed with the authors via email or at that conference you both attend annually.

All up, the two-year period it takes to devise, conduct and report on your work IS as much of a problem as any aspect of the work itself. If the point of all this is to learn and improve the educational process, the chances of doing that meaningfully in our lifetimes are low. After all, many of the outcomes from this cycle aren’t going to be particularly informative – two years to report that you really didn’t find anything significant this time around is a long time in the work life of an academic, and even longer in the educational trajectories of our students.

Dave Winer described why he writes his blog, saying:

I write to express myself, and to learn. Writing is a form of processing my ideas. When I tell a story verbally a few times, I’m ready to write it. After writing it, I understand the subject even better.

The motivation for why he writes and why we publish is similar. But the problem is the timescale. Blogging is relatively rapid. The short cycle time helps us share our thoughts and forces us to consolidate the things we’re thinking about. And it invites others to see, consider, and respond to what we’re thinking about.

I write to give people something to react to. So you think the iPhone was a winner from Day One. Great. Tell me why. Maybe I’ll change my mind. It’s happened more than once that a commenter here showed me another way of thinking about something and I eventually came around to their point of view. And even if I don’t change my mind, it’s helpful to understand how another person, given the same set of facts, can arrive at a different conclusion.

That’s the discursive dialogue that we’re missing in science, but which has been developed and understood in the open source community for some time. That’s the community from which Winer comes, and the transparent sharing of thoughts and ideas is part of the culture, the method of open source development. And that’s what’s missing from the higher ed learning research community. Our livelihoods are tied to the articles we produce, to the impact factor of the journals in which they are published, the attribution of the ideas to ourselves as the original authors. We sacrifice the enormous potential the community could give us by holding fast to the belief that if we share it openly we’ll see our intellectual contribution devalued, lost, or worse, stolen, by someone else claiming the idea as their own. We worry about our ideas being ‘stolen’ while in reality the fact that we put them out there in the first place establishes our provenance.

Once upon a time, when the mechanisms of producing and sharing ideas took huge amounts of capital, significant effort to create the production values we sought in quality work, and complex, costly, and time-consuming channels to distribute the work to our colleagues via journals trucked, mailed, and shipped to libraries in higher ed institutions around the world, concern about the provenance of an idea was more meaningful. Recall the Darwin/Wallace conundrum that surrounded the first published comprehensive expression of the idea of evolution by natural selection, read before the Linnean Society in London. (Not familiar with this extraordinary coincidence of paradigm-shifting ideas co-occurring? It’s a fascinating story, and one that owes a tremendous amount to Charles Lyell. Darwin was resigned to being ‘forestalled’ in getting the idea of natural selection, the driving force that works on natural variation (mutation), out to the world as his work. The process by which this was addressed, the compromise that emerged, and the thoughtful intervention and guidance of Lyell are a lesson in ethical conduct, friendship and skilful political savvy.)

The grant we’re writing is in part an attempt to demonstrate that there is another, more promising way to conduct this enterprise. The open notebook science movement has been around for some years – we are NOT claiming novelty here. We are simply trying to appropriate the methodology and apply it to research on learning design in engineering education. In this we’re adapting the work done by others, the most recent of which I’ve been reading on the blog of Mel Chua. Here she writes:

Radical transparency refers to the cultural practices used by healthy open communities (Free/Libre and open source software, open content, and open hardware projects) to expose their work in as close to realtime as possible and in a way that makes it possible for others to freely and non-destructively experiment with it.

From Mel Chua: Hacker. Writer. Researcher. Teacher. Human jumper cable.

We have to take the collective wisdom of the community and carefully, but more rapidly, apply it to improve learning design for higher education students. We don’t want to put the students in our courses at risk, a concern often raised as a “show stopper” by those who think that unless there is incontrovertible evidence of improvement from different teaching approaches, it’s better to stay with what works. But does it? That is, does it work? How do you know? And is there really the risk being asserted? As a colleague and co-writer of this grant once said in a panel discussion with me about changing his teaching to the ‘flipped classroom’ approach,

Students won’t let you ‘fail’. They will raise concerns early and loudly if something isn’t going well. Any teacher worth being in front of the room will respond and change what they are doing to avoid the catastrophe. Things may not go as you planned, but they won’t end up damaging the students in the class, because they, and a good teacher, won’t let that happen.

So where is the real risk? It’s in not being open and transparent to learn together.

–pdl–

Posted in Open_Research | Leave a comment

A literary aside

I was reading material about the role certification plays in higher education institutions. It’s part of the background work I’m doing for a piece on badging for a local website with my colleague and linguist, aka word guru, Roly Sussex O.A.E. But you know how these things happen.

Zombie literature?
Some rights reserved by e_monk http://www.flickr.com/photos/e_monk/6495883741/sizes/l/in/photostream/

A reference in the paper I was reading pointed to an article in Inside Higher Ed. The link was about scholarly publication and, at least in part, the role of peer review. Steve Kolowich, the author of the piece in IHE, was describing some ideas of Kathleen Fitzpatrick, director of scholarly communication at the Modern Language Association and a professor of media studies at Pomona College. Her point was that the scholarly book is no longer the primary mode of communication in the digital age. Yet it remains entrenched in the hallways of some disciplines as the only means by which one can jump through the tenure hoop. She described the scholarly monograph not as dead, but as undead.

I realise I spend little time in the literary world. I don’t re-read Jane Austen, and despite the recent anniversary I haven’t picked up The Pickwick Papers or A Tale of Two Cities (despite the New York Times poster that leveraged the opening lines to great effect many years ago – by the way, if anyone knows where one can get a reprint of that classic NYT advertising poster, let me know; I had one once and didn’t keep it 😦). What got me intrigued was the reference to “Pride and Prejudice and Zombies: The Classic Regency Romance – Now with Ultraviolent Zombie Mayhem!”. Huh?

Seth Grahame-Smith wrote a novel re-imagining the classic Pride and Prejudice but with zombies!  From the Amazon book description:

 a mysterious plague has fallen upon the quiet English village of Meryton—and the dead are returning to life! Feisty heroine Elizabeth Bennet is determined to wipe out the zombie menace, but she’s soon distracted by the arrival of the haughty and arrogant Mr. Darcy. What ensues is a delightful comedy of manners with plenty of civilized sparring between the two young lovers—and even more violent sparring on the blood-soaked battlefield. Can Elizabeth vanquish the spawn of Satan? And overcome the social prejudices of the class-conscious landed gentry? (Amazon.com)

Ok. So much for the draft of our column.  Where’s my Kindle???

— pdl —

Read more: http://www.insidehighered.com/news/2011/09/30/planned_obsolescence_by_kathleen_fitzpatrick_proposes_alternatives_to_outmoded_academic_journals (Inside Higher Ed)

Posted in Scholarly Communication | Leave a comment

What is Scholarship?

What is scholarship? That question arose in a discussion on Transforming Education. A nice reference was offered that seems worth sharing. Robert Diamond wrote the following description of scholarship. It’s short and sweet.

Recognizing Faculty Work, by Robert Diamond and Bronwyn Adam (1993), identifies six characteristics that typify scholarly work:

• The activity requires a high level of discipline expertise.
• The activity breaks new ground or is innovative.
• The activity can be replicated and elaborated.
• The work and its results can be documented.
• The work and its results can be peer reviewed.
• The activity has significance or impact.

This was summarised by Diamond for The National Academy for Academic Leadership:

• The activity or work requires a high level of discipline-related expertise.
• The activity or work is conducted in a scholarly manner, with clear goals, adequate preparation, and appropriate methodology.
• The activity or work and its results are appropriately documented and disseminated. This reporting should include a reflective component that addresses the significance of the work, the process that was followed, and the outcomes of the research, inquiry, or activity.
• The activity or work has significance beyond the individual context: it breaks new ground or can be replicated and elaborated.
• The activity or work, both process and product or results, is reviewed and judged to be meritorious and significant by a panel of the candidate’s peers.

The interesting thing is that while there is general acknowledgement that this description characterises scholarship reasonably and thoughtfully, our metrics and local incentives often fail to clearly reinforce it. This is particularly true when it is applied to one of the most important roles of the university: learning.

-PDL-

Posted in Uncategorized | Leave a comment

LAM/T, Process/Repertoire Hierarchies and the Sense to Express Them as a Story

Like Gardner, every time I read Augmenting Human Intellect I find something new, something unexpected, that Engelbart draws out of me in reaction to the depth of his thinking. This time was no different.

Resonating with me now is what I recognise as what we today call compound document authoring – though that’s a lousy description of it. The essence of the idea is the ability to connect disparate thoughts (the combination of the repertoire hierarchy with associative links) and surface them, not only for us to see in our own thinking (for metacognitive recognition) but for us to communicate to others.

Engelbart took a key insight of Vannevar Bush, the associative trails, and with the benefit of 18 years of technology development and thinking – most critically the recognition that graphical user interfaces were coming – transformed it into the description “Joe” gives us, wherein ideas are connected graphically as fragments, sentences or paragraphs. I don’t recall whether he actually talked or wrote about naming the relationships themselves, but today we would.

This immediately conjured up the work going on in compound document authoring, where ideas are drawn not just from oneself but from disparate documents and connected with directed graphs, each association described by metadata that identifies the meaning or intent of the relationship. Interestingly, Jane Hunter’s eResearch group has built these kinds of tools to remix or connect thoughts across parts of a document or across different documents. The AustLit LORE tool is a prime example.
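For the curious, here is a minimal sketch of that idea as a data structure. The names (Fragment, Link, the ‘elaborates’ relation) are hypothetical illustrations of a labelled directed graph, not the actual data model of LORE or any other tool.

```python
# A labelled directed graph linking fragments across documents. All names
# here are hypothetical; the point is that the association itself carries
# metadata naming the meaning/intent of the relationship.
from dataclasses import dataclass

@dataclass(frozen=True)
class Fragment:
    doc: str     # source document
    anchor: str  # location within the document, e.g. a paragraph id

@dataclass
class Link:
    source: Fragment
    target: Fragment
    relation: str  # the meaning/intent of the association

graph: list[Link] = [
    Link(Fragment("engelbart-1962.html", "para-12"),
         Fragment("bush-1945.html", "para-6"),
         relation="elaborates"),
]

def outgoing(frag: Fragment) -> list[Link]:
    """Follow every association leading out of a fragment."""
    return [link for link in graph if link.source == frag]
```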

We’re planning to collaborate with Jane Hunter and her eResearch Lab to develop this idea in the context of metacognitive portfolios.

Thank you, Doug. It’s always inspiring to read the depth of your thinking, even 50 years on from the writing of Augmenting Human Intellect. If ever a man was ahead of his time…

pdl

Posted in Uncategorized | 1 Comment

Innovation – Don Quixote’s Army

From several directions – Twitter, email, and conversation – I’ve encountered a converging set of messages, all focused on some aspect of innovation. The general theme is “why is it so hard?”. A succinct summary of that point was made recently by Terry Cutler:

Innovation is “a deliberate and thoughtful act of defiance against the status quo in order to make a difference.”

This was complemented by a colleague writing in a discussion list about the ambivalence of educators toward adopting technologies in their teaching. He reminded us that Neil Postman had something to say of relevance here:

Postman used to say that it’s the job of university professors to teach AGAINST THE CULTURE. (Ed Lamoureux)

Innovation pushes you outside your comfort zone, and those who are uneasy with ambiguity or uncertainty have particular trouble with that sensation. The problem is that creativity, or at least the talk about it, has become a positive cultural norm – that is, not the act itself but the rhetoric around it. We espouse the value of creativity, but it makes us uncomfortable, so in the same breath that we advocate for it, when it is expressed in a new way of doing something that touches us, we revert to caution and express concern for risks and negative impacts. Not that harm will occur, or that danger or damage is imminent if the innovation is pursued – just that it might be horrible, because there is a potential for things to go wrong, and that needs to be responsibly avoided.

Jennifer Mueller, Shimul Melwani, and Jack A. Goncalo, authors of the paper “The Bias Against Creativity: Why People Desire But Reject Creative Ideas” (DigitalCommons@ILR), wrote:

…if people hold an implicit bias against creativity, then we cannot assume that organisations, institutions or even scientific endeavours will desire and recognise creative ideas even when they explicitly state they want them.

All of this was summarised nicely by Stephen Matchett, writing, of all places, in the blogs of The Australian newspaper (a publication I no longer read regularly because of its unrelenting bias against the reality of climate change and human contributions to it), in a post entitled “No One Likes a Smart Innovator”. The sad reality is that it is right on the mark.

In universities everywhere, I suspect, but certainly where I have current first-hand knowledge, a radical course redesign supported enthusiastically and fully by the faculty can run into deep opposition among senior university leadership. It is too much change, too fast, putting too many students at risk – or so the response is often articulated. Why, one is asked, couldn’t you just make one or two tweaks to the course and teach everything else the same? But that’s the problem with change – you do it or you don’t. It’s hard to innovate halfway.

Yes, there is risk. Yes, it could fail to work out as intended. But the risk is minimised because the faculty proposing and teaching the innovations are committed and responsible. They will do their utmost to avoid disaster if they really believe that’s where they’re heading. And they’re equally passionate about the potential upside of the course redesign and its benefit to students. That in itself will mitigate even those changes that, when tried, turn out to be less than we hoped.

I’m reminded of a wonderful saying attributed to “Doc” Edgerton, the father of high-speed photography among other things. He said:

That’s the nature of research–you don’t know what in hell you’re doing.

That’s true, though “Doc” was exaggerating to make his point. Good research is in fact based on deep inquiry that sits on a sound but unfinished framework. The best research re-shapes the framework itself. Recent findings from CERN have experimentally undercut much of ‘string theory’, with experimental physicists finally having an instrument that collects data allowing them to tell their theoretical colleagues: think of something else, because the predictions from string theory aren’t substantiated by experimental data.

We face times that need innovation more than ever. We have to embrace the ‘mindfulness’ of openness, questioning what we ‘know’ and what authority says must be. Without that we’re in for a long slow descent.

— pdl–

Posted in innovation | Leave a comment

Productive Failure

Education today is tremendously risk averse. There’s a good reason for that: the institution of the school, or of tertiary education, is entrusted by its community with educating its most precious commodity, its children. The last thing one feels comfortable doing is trying out something that, heaven forfend, might fail. We’d be abdicating our duty of care. Or would we?

One of the most common and salient characteristics of human endeavour is failure. It’s basic to our learning and intrinsic to our neurological development, our motor skills, and our basic interactions with the world. Yet in the spaces of formal education it has become a liability to avoid rather than a process to encourage.

There are, of course, kinds and degrees of failure. No one is suggesting that we watch lovingly as our kids, fascinated by the flame on the stove top, reach out and burn their fingers, only to follow up with, “Fire is hot, Sally. Try to avoid doing that in the future.” But we can and do push them off on their two-wheeler after a period on training wheels, running alongside and steadying where we can, ultimately trusting them to get the hang of balancing on two spinning gyroscopes, while expecting that, at least in these early stages, there will be a crash or two and bandaids will need to be at the ready.

In education, failure is part of trying. It happens even with the best of preparation when one is reaching beyond oneself. And it’s good. Whether in the sciences, humanities or the arts, it is key to learning. We strive there for what Rocco Landesman, Chairman of the National Endowment for the Arts (US), calls “productive failure”. He spoke recently on this topic at a public high school graduation ceremony in the US to the assembled eager graduates.

Where arts education has been nurtured… however, innovation born of struggle results.

“Productive failure” — something the arts are particularly suited to teach, he said — is useful in all parts of life, from medical research to the business world. If people are doing something they enjoy, he said, failing can inspire them to try harder and produce creative alternatives.

He told the graduates to think about the times they couldn’t make a scene work, or couldn’t complete a dance combination, or couldn’t get the light right for a painting. As artists, he said, they understood the role of luck and of perseverance through failure better than almost anyone.

“You didn’t quit, you tried again, you tried harder, and you tried something new — it was productive failure,” Mr. Landesman said. “Those of you who failed often, succeeded sooner.”

@GardnerCampbell brought this article to my attention in a tweet (thanks, Prof. C), and it reminded me how much we’ve tried to engineer learning situations to avoid failure. An early conversation with colleagues who specialise in academic staff support (aka faculty support) in Australia introduced me to how profound cultural differences can be in seemingly similar environments. Here, I was told, failure is not to be talked about in that language. It damages a person’s sense of self-worth. We need to protect students from the long-lasting negative effects such language has on their developing personalities. I was a relatively recent arrival, but I was nevertheless gobsmacked. Failure, I said, happens to all of us. In my world the goal is to fail early and often, as that’s the only way I really learn.

I’m reminded of sports analogies. Coaches commonly encourage their athletes: if you aren’t failing, you aren’t trying hard enough. The failure conditions to be avoided are those that are life-threatening; those that are ego-bruising are worth striving for. What better place for learners to try and fail than in the relatively ‘safe’ environment of school, where there is understanding, support, encouragement, and shared experience?

— pdl–

Posted in Uncategorized | 2 Comments

What makes for success & fosters innovation?

The joy of exploring – what work should be.

With the announcement of Steve Jobs stepping down from the role of Apple’s CEO, there has been a flood of responses from the media, business pundits, and the general public. Whatever you might think about particular aspects of his approach to leading Apple and Pixar, there is general consensus that he possesses a combination of leadership and vision rare in any era. For example, I don’t much like the closed nature of the Apple universe, but I appreciate why he took that path. But among all the posts, thoughtful, blathering, critical, or laudatory, the short piece by Carmine Gallo in Forbes was among the most succinct.

Gallo titled the post “What makes Steve ‘Steve’?” He is no stranger to the topic of Steve Jobs, having written two books about him in the recent past. In this post for Forbes, however, he zeros in on the seven things that distinguish creative innovations leading to success, and which typify what makes Steve “Steve”. Gallo’s post is worth reading for the elaboration on each point, but I’ll provide the short list below:

1. Do what you love
2. Put a dent in the universe
3. Connect things to spark your creativity
4. Say no to 1,000 things
5. Create insanely different experiences
6. Master the message
7. Sell dreams, not products

Do what you love – The first is not just arbitrarily placed at the top of the list; it is the priority. Just a few days ago in Lifehacker, David Fuhriman wrote a post entitled “If You Wouldn’t Do Your Job For Free, Then Quit”. His brother had graduated from Yale in 2009, and during the graduation proceedings a list of ‘advice’, you know, the platitudes that seem to accompany graduation ceremonies everywhere, was given to the soon-to-be lawyers, writers, and highly educated unemployed.

• An hour of sleep before midnight is worth two, and an hour of work before noon is worth two.
• Always pick your kids up from school. That’s when they want to talk.
• Never let your skill exceed your virtue.
• Never take less than two weeks off when you have a child or for your honeymoon. Don’t let them talk you down.
• When you mess up, admit it frankly and quickly, and move on.
• Always do your very best in your job, but if you don’t like what you’re doing enough that you would do it for free, quit.

It was the last of these that Fuhriman reacted to most, and it led him on a quest to change jobs (he was doing accounting), through several intermediate “test” careers, to his current one (small business consulting), where he seems to have found himself and his niche. This last piece of advice, do what you’d do whether paid or not, is all about finding your passion, because therein lies the intrinsic motivation you need to put in the ‘10,000 hours’ it takes to have the potential to achieve great things.

Put a dent in the universe – This is pretty obvious: do what matters. If what you’re doing doesn’t amount to a hill of beans in this world, why would you waste the precious time you have doing it? What matters doesn’t mean solving world peace or creating the successor to the iPod. Being there for your kids, or making a difference in the lives of those around you in some small way each day, are things that matter too. The point is that changing the world is good – it’s just that the world is a complex system made up of things big and small. The scale part isn’t the thing; the ‘mattering’ part is.

Connect things to spark your creativity – This is one of the things that makes higher education so powerful, and simultaneously one of its attributes most in jeopardy. The horizontal thinking that comes from intellectual exploration is often among the first things sacrificed in the pursuit of learning outcomes. We often don’t know, and can’t tell in advance, what will make the difference in understanding a subject, or how a set of subjects will bring understanding to us. Yet it’s precisely these lateral elements that come under the microscope when we’re asked to demonstrate return on investment. It’s the ‘liberal’ in a liberal arts education. Given the political polarisation today, perhaps it should be rebadged a “lateral arts” education. It is what Gallo points out Jobs was masterful at: creating things by joining the dots between ideas in different fields. It’s the truth behind the aphorism that research is most productive at the intersection of disciplines.

Say no to 1,000 things – This one is where I typically fail miserably. Yet I’ve heard it in so many contexts that there is no doubt in my mind that it’s true. Focus, focus, focus. To do that means saying “no”. A colleague of mine at the Sloan School at MIT told me about the lessons he learned in his first few years as a faculty member there. When asked what was different about his new position at Sloan, he thought a moment and said, “Each week someone will pop their head into my office, describe an insanely interesting research project, and ask if I’d like to participate. Week in and week out a face will appear and point out a line of research or a grant opportunity that sounds like it’s just impossible to pass up. It was really hard at first because I’d never had experiences like this before. But here it’s different. Here the secret is saying ‘no’, not saying ‘yes’. The good news,” he concluded, “is that the really terrific thing you passed up two months ago, which turned out to be spectacularly successful, doesn’t make you feel as bad, because you know something as good or better is going to come along in the weeks and months ahead.” It’s all about saying ‘no’ to most things so that you can put your whole being into the things you say ‘yes’ to.

You can read the rest of what makes Steve “Steve” in Gallo’s post. It was the first few that really rang true for me, and I think they resonate across disciplinary and professional boundaries. But it’s the first that matters most. Being able to get up each and every day and eagerly approach ‘work’ with anticipation and gratitude – that’s priceless.

— pdl —

Posted in innovation | Leave a comment