Moving mountains: the reforms that would push academia to new heights

Until the pandemic forced teaching to go online almost overnight, universities were widely considered impervious to major change. But if one age-old practice can be flipped on its head, why not others? We ask six academics where they would direct their efforts first 

February 18, 2021

Clarify the scientific future

The email came in at 9.23pm on a Sunday. “By the end of this week we will be delivering teaching fully remotely,” said senior management. “The VC has also decided that this will be true for the whole university.” It was 15 March, a week before the UK entered its first national lockdown.

Propelled by panic and coffee, colleagues immediately mobilised to convert hundreds of hours of lectures and practicals for online teaching. Within days, the first guidance was circulated on how best to use Zoom and Teams, how to record lectures properly, and how to maximise recording quality and student engagement. As management put it a few days later: “In these circumstances wisdom states that we should, to the best of our abilities, face the music and dance.”

By 20 March, the schools had shut, and now many colleagues faced not only reshaping their teaching programmes but also home-schooling and childcare in parallel. Stress and fatigue levels rose even higher, and for many of us our kids became the regular support acts in our lectures and Zoom meetings.

Meanwhile, the gravity of the situation called on those of us with additional means to help fight the pandemic. For me, that involved a publishing initiative: the morning after the VC’s announcement, we launched a special scheme at the Royal Society to fast-track the peer review of Registered Reports submissions related to Covid-19. Nearly a thousand scientists and several major journals would join this rapid review network, committing to assessing Covid-19 research manuscripts within 48 hours.

Academia doesn’t change easily, and many of my colleagues were sceptical that we would be able to adapt so quickly to this brave new world of online instruction – particularly given that we were emerging in the UK from a period of scarring industrial action marked by the intransigence of institutional leaders regarding long-overdue reforms. But what happened in the days and weeks after 15 March proved just how adaptable and effective we can be when we rally around a common goal. Some institutions handled lockdown better than others, but every lecturer-in-the-trenches I know did everything in their power to support their students and the public during this crisis.

Nearly a year later, and again in lockdown, I believe that one of the few silver linings of 2020 is that it tested the limits of our versatility and revealed that we are stronger and more capable than we realised. And I believe we can harness this confidence and agility to solve other problems.

Among the greatest and most vital of these challenges is improving the way we do research. Over the past 10 years, we have seen alarming signs that some areas of science – including many in the social and biomedical sciences – make claims that cannot be verified or repeated, largely because too much of the research process is weak, biased or hidden from public scrutiny. Reforming the culture of science to make it more honest, transparent and reliable requires the coordinated actions of researchers, journals, funders, governments and – last, but not least – universities.

But are universities up to the task? Before the pandemic, I might have been tempted to say no – or at least to caution that a long war against deep bureaucratic inertia lay ahead. Compared with funders and journals, universities had been dragging their heels on embracing concrete reforms such as the open sharing of data and the preregistration of research protocols; scientists had for years lamented such barriers, which they attributed to “the system”. But having seen what we can achieve in the face of Covid-19, it is clearer than ever that “the system” is nothing more than a set of choices we make as a community. The growing list of universities joining the UK Reproducibility Network shows that our community is finally treating this issue with the seriousness it deserves, even in the midst of the pandemic.

Other major problems now seem equally solvable. Changing our working culture to combat climate change by holding more conferences online? Doable if we invest in superior videoconferencing technology, learning from our experiences in the pandemic but moving beyond the limited tools we have relied on so far. Solving the gender pay gap in our universities? Doable if, once and for all, we commit to equitable hiring and promotion practices. Eliminating or at least dramatically streamlining the wasteful and exhausting bureaucracy of the research excellence framework and its cousins for teaching and knowledge exchange? Very doable once we see these “assessment exercises” for what they are: the indulgences of a flabby, pre-pandemic bureaucracy that other countries do perfectly well without.

All of this can be achieved if we coordinate at grassroots level the way we did in 2020. If academics can contribute to online teaching within a matter of days – not to mention making major contributions to developing vaccines within a matter of months – we should never again doubt our ability to change.

Chris Chambers is a professor of cognitive neuroscience in the School of Psychology at Cardiff University.


 


Supervise the supervisors

Covid-19 has accelerated a commitment to vacuous words and phrases. Agile. Co-design. Resilient. Robust. Socialise the document. Whenever these words are summoned, a subcommittee follows, with a long agenda and few outcomes beyond mitigating the spotlight deprivation of managers who had a new haircut for the Zoom call.

But amid all this empty performativity – add to it Human Resources’ injunctions to just be excellent to each other, dude – there have been some unexpected advances in higher education. A prime example is doctoral supervision.

It remains one of the great ironies of higher education that while most of us in the sector are employed to educate, any professional learning offered to improve our practice leaves us as repulsed and as lost as Jack Nicholson at a women’s studies conference. Hence, we typically supervise PhD students as we were supervised (or, worse, as we think we were supervised). One supervisor’s or adviser’s “experience” is regarded as more important than any amount of peer-reviewed research into doctoral education. This is the only area of university life where a data point of one is still valued.

Moreover, this attitude blocks change. When many of us completed our PhDs, old white blokes in dodgy jumpers or polyester suits that are a fire hazard to surrounding postcodes taught younger white blokes to be academics just like them. Women were an inconvenience because they would “waste” their education and have children. The working class were deemed underprepared for university by their supposedly substandard schooling. And students of colour offered uncomfortable reminders of the profound past inequalities that live on in the present.

These students were inconvenient because they proved that homology was not a functional teaching and learning strategy. But, now, we have the greatest diversity of doctoral candidates in the history of our universities – many of whom will go on to careers beyond the academy. The Black Lives Matter movement, plus greater awareness of dis/ability and the shocking scale of sexual harassment and assault in our universities, means that those old assumptions about academic life are no longer acceptable. In truth, they were never acceptable. Those who still peddle them are like a drunk uncle at a wedding, dancing to the Village People’s “YMCA” and thinking his moves are generalisable to daily life.

To be fair, research cultures were beginning to change even before Covid-19. Research integrity policies around the world, for instance, were starting to critique the idea that having a cup of coffee in the vicinity of junior researchers preparing a paper is enough to merit authorship. Deep thinking about modifications to methods was also being undertaken before the pandemic, as vulnerable populations can no longer be studied as docile communities of subjects ready to jump into the metaphorical Petri dish. But Covid-19 has accelerated reform, particularly at the supervisory level. Unpopular face-to-face supervisory training has been replaced by regular, scheduled Zoom meetings.

The pandemic has stimulated online efforts in professional development, too. At Flinders University, we deployed a podcast series called Steps (in homage to the reformed pop band, obviously) to create short, bespoke sessions. Academics requested topics, and the capacity to slot professional development around other responsibilities proved popular. Topics ranged from industry engagement to the higher doctorate, the posthumous thesis and the PhD by prior publication.

These short sessions provided a slice of learning – an intervention – for all supervisors. Sessions for early career researchers about supervisory relationships and communication systems were tailored to their needs. For the senior scholars – the most reluctant to participate – the convenience (nearly) overcame their bravado. We could still acquiesce to their South Park-style demands to “respect my authoritay” while providing quirky topics and ideas beyond their individual experience.

Podcasts’ capacity for professional development – the ability to whisper in supervisors’ ears while they are walking, exercising or cleaning the house – has proved one of the unexpected gifts of Covid. That metaphorical drunk uncle dancing to the Village People is learning some new moves. He is not Beyoncé, but, through new ways of learning and thinking, he is starting to see how he could be. We have momentum for change.

Tara Brabazon is dean of graduate research and professor of cultural studies at Flinders University. Her new book, The Creative PhD, co-written with Tiffany Lyndall-Knight and Natalie Hills, is published by Emerald Publishing.



Meeting deadlines

Recently, I experienced a small miracle. I logged on, via Zoom, to a committee meeting I had to attend. About 12 minutes later, the meeting was done and I turned to my next task.

Even including a few minutes of preparation – find the email with log-in information, prepare notes, a swipe of lipstick – the entire event took less than 15 minutes out of my day. When I told colleagues about this, they suggested it might be a good idea to buy a lottery ticket as my star was clearly in the ascendant.

I have had short meetings at my university before, but most tend to be quite a bit longer. In the German system, it is not unheard of for meetings to stretch to three hours or more, and two hours is not an unusual length even when there is comparatively little business to be handled.

Since Covid-19 has forced most operations online, my tasks have typically become more arduous. Teaching takes more preparation. Getting a book from the library requires reserving a slot weeks in advance. Even something as basic as signing forms requires printing, scanning and file management. But the truncated meetings are giving me at least a portion of that time back.

Some of the reasons for these faster meetings are practical. There is no travel time required between campus buildings, and no coats to hang up once we’re there. Because of privacy regulations, we are not allowed to name individuals who are not in the same Zoom session, so many documents that might usually be read out loud are simply distributed to the committee beforehand and presumed read. The only real loss is not being able to chat with colleagues before and after the meeting proper, but in pragmatic terms, that, too, leads to a much shorter appointment.

There are other, more subtle, reasons for our newly streamlined meetings, too. Most people I know find Zoom miserable, and are disinclined to stretch out yet another exhausting online session by artificially extending the discussion on any given point. Once the main arguments have been stated, everyone is willing to move on, with no grandstanding. Tired of staring at our screens, my colleagues and I are turning to phone calls wherever possible so we can avoid both video calls and endless email chains. As the rest of our work has expanded, more of us are becoming protective of our time.

If we can improve how we do meetings without even trying, perhaps we could also do so intentionally – including when we are allowed to meet in person again. We might ask ourselves which of our meetings really are necessary, and cancel those that are simply held for form or to discuss issues that could be resolved with a phone call – especially if they are scheduled outside standard office hours.

Simple procedural votes can be done via email or online poll. More documents can be pre-circulated, so that meeting time can be used for focused discussion. Meetings where the unspoken social interaction between participants is not particularly important can still be held via Zoom, saving participants the travel. And all of our meetings should have a scheduled end point that is actually enforced.

But what about the social element? Instead of whiling away our lives in administrative meetings, we could spend our time doing things with our colleagues that are engaging and intellectually fruitful. Reading groups. Colloquia. Discussing ideas and working on research problems.

It may sound like a pipe dream, I know. But I have beaten the odds before.

Irina Dumitrescu is professor of English medieval studies at the University of Bonn.



Decalcify the curriculum

I’ve long been frustrated with the lockboxes of courses and syllabi. Some years ago, I wrote a piece called “Against Reading Lists” for The Chronicle of Higher Education suggesting that, given how easy it is to find texts on the internet – or buy them via Amazon – it makes sense to have flexible syllabi that can shift mid-course with the developing interests of students and teachers. My university, perhaps in response, sent out a memo the following week requiring all faculty members to submit their course syllabi to the dean and regard them as an unbreakable contract between student and teacher. The dead hand of tradition weighs heavy on the habitus of academia.

But is there actually a problem with the current idea of “the course”? To the concept’s credit, it encompasses a fixed period of time, a set number of students and a knowable set of texts (at least in the humanities). That predictability, one could argue, allows for shared expectations, distinct assignments, pre-measured course credits and some kind of comforting continuity. On the other hand, it may also lead to professors and students phoning it in as they retread overfamiliar ideas.

In the worst of worlds, the course can feel like a jail sentence for students who are past the add/drop deadline, with non-negotiable requirements and a dictatorial leader who tolerates no dissent. In the best of worlds, a 15-week course provides students with new insights at every turn and gives professors creative opportunities to try out ideas and theories.

We now have the technology to make that world a possibility. In my modest proposal, the old, fixed carapace of course and syllabus is sloughed off and replaced by a matrix of courses or presentations (of varying durations) through which a student can journey, via the internet or in person.

Take, for example, one student – let’s call her Conservative Cheryl – who decides to take the usual run of courses in a semester. She attends Professor Pompous’ class on Victorian fiction. She also takes Professor Eccentric’s speculative fiction course and Professor Kool’s class on cyberpunk novels. Meanwhile, another student, Experimental Eric, decides to take the first two weeks of Pompous’ class, the next three weeks of Eccentric’s course, and finishes the semester with Kool’s seminar. Eric will have the advantage of having tried out a number of approaches, so that the following semester he might decide to stick with one professor for a longer haul. Cheryl, however, has no such flexibility – much as she might have liked her classes.

In addition, academics could give free-standing lectures and presentations, or very short courses, that would count towards a student’s credits, just as medical doctors routinely earn continuing education credits for attending lectures. This would intensify the intellectual life on campus by creating non-course content that would attract larger numbers of students. It would also free up professors to try out new ideas in, say, three-week courses.

One of my colleagues asserts that students don’t actually take courses for the content but rather for the professor. While I don’t necessarily agree with that analysis, it is clearly true that the success or failure of a course largely depends on the innovation, clarity and personality of the academic. The system I am proposing would allow those qualities to be front and centre. Professors could see how they were doing by tracking feet on the ground and posteriors in seats as students created their own game plans for their education.

We have mostly given up on rigid distribution requirements, obliging prospective majors in particular subjects to take a certain number of courses in certain areas. Might it not be time to go further and abandon the archaic infrastructure of the fixed course and syllabus entirely? Should we not embrace the more rewarding flow of data, information and knowledge that is characteristic of our age and technology?

Lennard Davis is distinguished professor of English at the University of Illinois Chicago.


 


Impact assessment

Before the pandemic, I typically walked into my classrooms with nothing but a book, a cup of coffee and a small notebook filled with handwritten lecture notes. I discouraged the use of laptops in class and I avoided PowerPoint. I helped students navigate Hamlet, Emma and Paradise Lost with only a whiteboard marker – and, occasionally, a DVD.

Today, I have my own YouTube channel, where I post video lectures for my students. I record and edit course-specific podcasts with my colleagues, and every week I post assignments and manage discussion forums on an online learning platform. My wife notes with bemused fascination how my cranky techno-scepticism has been so dramatically swept away.

I may have adopted these tools reluctantly, but I see now how they can help us reach new audiences. I do not believe that the current disruption signals the end of traditional education: there are too many advantages to the old model, and students overwhelmingly prefer classrooms to chat rooms. Still, I do think that the rapid, widespread adoption of digital tools opens up new avenues for what the granting agencies call “knowledge mobilisation”.

The relative inaccessibility of academic research and expertise is a longstanding problem. There are financial barriers for the average person: paywalls for scholarly journals and high purchase prices for academic monographs. The slowness and, in some cases, the fickleness of peer review processes also hamper the dissemination of knowledge. And the stylistic and generic conventions of scholarly writing can make our work impenetrable to all but a select few. The result is that governments and the general public do not always understand what we do or how we contribute to the public good.

Empowering academics to speak to different and broader audiences more easily could help address this. These new tools allow experts to address emergent issues and problems more quickly, and to speak to local concerns or specific constituencies with more agility. And there is clearly an appetite for it: some academics have capably used social media to this effect for years already, building large audiences online while sharing their research or providing valuable scholarly context to the news of the day.

One challenge arising from these new practices will be measuring and evaluating new media scholarship for the purposes of tenure, promotion and hiring. How would we evaluate a colleague’s expert contributions to a popular current events podcast, for example? We already grapple with similar questions when evaluating professors who contribute to their local newspapers, or whose expertise in ecology or anthropology is solicited by governments. But our sudden adoption of proliferating new media tools for professional purposes promises to make these evaluations more varied and complex.

I can sense my fellow techno-sceptics rolling their eyes at the prospect. Why must they bother with all this? Isn’t it overly optimistic to imagine that showcasing the kinds of work we do in our classrooms, libraries and laboratories will increase appreciation of its value among those outside universities’ walls?

Perhaps. But, given the rise of anti-intellectualism and the spread of conspiracy theories, I believe that we need to defend our universities at every opportunity, with every tool we have. We are in no position to look askance at such opportunities.

We have been using technology throughout the pandemic to preserve the things that are most essential to us. We must make best use of it to do the same in the post-Covid era.

Andrew Moore is director and associate professor in the great books programme at St Thomas University, New Brunswick.


 


Teaching expertise

As the pandemic forced campus closures last year, seasoned professors who had always ignored online delivery, or delegated the digital components of their courses to junior casuals, suddenly found online platforms to be their sole means of teaching – often with only a rudimentary understanding of the technology and the technical skills required, and with no training in online pedagogical methods.

Then again, while the speed of the transition may have been unprecedented, academics’ lack of adequate preparation was not. There is, after all, a near-universal lack of teacher training for university teaching staff. Academics have long been recruited on the basis of their research potential and specialist discipline knowledge: no teaching qualification is expected, and questions at interview about teaching philosophy or practice are often regarded as trivial.

In a recent US survey, 72 per cent of university students expressed a preference to return to face-to-face teaching and the vibrancy of campus life as soon as pandemic restrictions ease, but more than 50 per cent also indicated that they would welcome at least some regular digital components in their courses in future. University staff, too, have now seen some of the benefits of online delivery: they are already debating what the right ongoing mix of face-to-face and online modes might be. Many have also become aware of their lack of knowledge and skill in online technology and pedagogy.

This – alongside universities’ new spirit of agility and innovation – unmistakably creates a climate in which universities might finally end their practice of appointing academics who are unqualified to teach.

The PhD is the fundamental research qualification for an academic appointment, but imagine if a unit on teaching skills became a standard part of its coursework. Imagine, too, if, as part of ongoing staff development, all teaching academics were coached in online instructional design and learning practices, gained hands-on experience of current digital tools, and debated – on the basis of empirical evidence – the right blend of face-to-face and online elements and the right balance of synchronous and asynchronous modes for their specialities.

In the increasingly hybrid learning environment, many universities will need to expand IT development and support for their virtual learning environments, as well as encourage the establishment of staff teaching support networks, with incentives and awards for good practice. It would then become standard for academics to discuss how to motivate students online, how to promote deep rather than superficial understanding of digital content, how to give online feedback and how to conduct online assessment. It would also stimulate debate on how to develop in students the judgement needed to apply a critical eye to the unmoderated and often unsafe online world.

It is arguably also time to consider how to promote research into cognitive learning more broadly, so that the face-to-face elements of teaching can likewise be based on the latest and most rigorous research.

A major shift of attention to the pedagogical preparation and development of academics would only sharpen the focus on teaching – which, after all, remains a core mission of universities the world over.

Warren Bebbington is a professorial fellow of the L.H. Martin Institute at the University of Melbourne and the former vice-chancellor of the University of Adelaide.

POSTSCRIPT:

Print headline: Changes for good


Readers’ comments (2)

About a half century or more ago, there was a movement to develop innovative practices in K-12 education, and a small number of efforts to change the structure of post-secondary education, including that of the university. Post-baccalaureate systems, due in part to promotion and tenure, have been slow to change, the easiest path being to proliferate the number of refereed and ranked journals – enabled in part by technology and by Eugene Garfield's creation of the Institute for Scientific Information and its further developments. One rather large volume on the latter subject, The New PhD, points out that the numbers surviving these programs have thinned and that those who make it through rarely achieve a post in an R1 institution. That makes the anecdotal information on baccalaureate teaching in this article almost historic. Academia has been slow to address the platform economy, mostly seeing platforms as tools rather than grasping their implications for systems change. Much that was understood in the past has only been recolored as innovation in the present. One volume on the future of HEIs, particularly their post-secondary programs, admonishes us not to sacrifice the Queen (meaning the humanities writ large). While that might read as a plea not to drop such programs, the greater insight is that, in a default, techno-driven world, it also implies a fresh look at HEIs themselves as institutions in form, function and practice.
"Academia doesn’t change easily, and many of my colleagues were sceptical that we would be able to adapt so quickly to this brave new world of online instruction" In this instance the interests of staff and management coincided. But the behaviour of institutions is determined by the policy landscape, not by what staff want. Why do university scientists falsify results? Because the prestige and good publicity following from their falsified results help the institution to navigate the policy landscape, to compete with other institutions, so it will reward the researchers - promotion, security, etc. If the policy landscape shifts slightly so that the short-term gains from spectacular but false results are outweighed by the negative consequences of discovery, the desires of the institution and of the academic staff will coincide and change will be relatively easy. If the institutions have no incentive to stop rewarding false results, things are much less likely to change. Eliminating the REF? Those exercises cast the landscape in black and white so that every institution believes it sees clearly how to maximise its chances of winning, of climbing higher in the landscape. The desires of academics will carry no weight in that matter.
