Wednesday 11 September 2019, 1:30 PM-2:45 PM (75 min)

Scholarly publishing seems to have been poised at the threshold of the open access revolution for over twenty years. While there has certainly been tremendous progress over that time, we have yet to realise all the potential that appeared to be within our grasp at the outset of the 21st Century. Part of the difficulty is that the community has become entangled by the metricisation of research assessment. Unpicking this devilishly knotty problem is the task that DORA – among others – has set itself. But it is not enough just to expose the problem. We have to focus on providing workable solutions.

Learn more about DORA at https://sfdora.org.

Speaker

Prof. Stephen Curry 
Assistant Provost (Equality, Diversity & Inclusion), Professor of Structural Biology, Imperial College London, and Chair, DORA

Stephen Curry is a Professor of Structural Biology at Imperial College London where he also serves as the Assistant Provost for Equality, Diversity and Inclusion. For many years he has been a writer and campaigner on a range of scientific issues including open access, research assessment, research funding and science policy. He is currently chair of the San Francisco Declaration on Research Assessment (DORA).

Download slides:

Keynote – Stephen Curry

Transcript
[00:00:15.98] WAYNE SIME: Well hello, good afternoon, and welcome to the ALPSP Conference 2019. My name is Wayne Sime. I’m the ALPSP chief executive, and it gives me great pleasure to welcome you all here today. Now we’re going to begin, as all great conferences do, with housekeeping and notices.
[00:00:36.19] So there are no fire drills planned over the next few days, but if you do hear a fire alarm, please move to your nearest exit, which is either in the Hanover lounge or main reception. Details on how to access the Wi-Fi, which is kindly sponsored by Aries, can be found on the front of your conference pocket planner. Just for your information, the username is Aries followed by the password Aries19.
[00:01:04.06] Please don’t forget to tweet about the conference– hashtag #ALPSP19. Toilets can be found near the Beaumont restaurant just down the corridor from this room. Smoking is not permitted in bedrooms or in the venue, however there are smoking areas located outside the Hanover patio, or outside the main reception. There is also an area near the Beaumont restaurant.
[00:01:30.16] Please be aware that all sessions will be recorded for video or for audio and used on various platforms, including the ALPSP website. Please avoid making comments or criticisms regarding named individuals or organizations in favor of more general comments. If you do wish to opt out, then please let one of the ALPSP team know. If anyone needs help or advice with anything, please ask a member of the ALPSP team. This year, they’re all wearing very fetching grey polo shirts, and also have green ribbons on their badges.
[00:02:06.43] ALPSP council members have red labels, and I am confident that they will also be able to help you. Please don’t forget about the ALPSP AGM– it’s tomorrow at 12:20 in Hanover three. I think I must give– I mean obviously, we’re very grateful to all our sponsors. But in particular, I’d like to mention PLS, who are our platinum sponsor, CCC for the welcome reception this evening, and Atypon for the conference dinner.
[00:02:36.52] We’re delighted to be raising funds again for FODAD, the Friends Of Della And Dawn. For anyone who is not familiar with this charity, it was set up to help a small fishing village in Sri Lanka that was devastated by the Boxing Day tsunami in 2004. Over the last few years, the generosity of our conference delegates has helped raise funds for a new water tank, a fishing boat, and even a teacher for the village. So if you’d like to help raise money this year, we have a fun run/walk tomorrow setting off at 5:00 PM from main reception. Please just ask one of the ALPSP team if you want to sign up.
[00:03:16.70] Now I’m sure you know that ALPSP is a friendly network of people, so please do remember to chat to anyone who you see is alone or attending the conference for the first time. After all, it’s good to talk and for everyone to feel part of our community. Now I’m pleased to say that’s the end of the notices. I’m sure you’re as relieved as I am. It’s now my great pleasure to introduce you to the person who is going to get the conference going, Sarah Faulder, the chief executive of PLS. Thank you.
[00:03:46.91] SARAH FAULDER: Well thank you very much for that introduction, Wayne. I’m honored to be getting the conference going. I’m really delighted that Publishers’ Licensing Services is this year’s platinum sponsor. As a not-for-profit organization, PLS is dedicated to serving the interests of publishers. And we’re here to support you, particularly in the transition to open-access which is obviously the big theme of the conference.
[00:04:15.83] We’re very pleased, for example, to have already put in place, some five years ago, the Access to Research service, which provides free online access to over 15 million academic articles in local public libraries. And that’s with the full support of publishers. And obviously, it’s a stepping stone on the road towards full open-access publishing.
[00:04:41.27] Now I’m really here to introduce our keynote speaker today, professor Stephen Curry. Stephen describes himself as a researcher interested in molecular biology by day, and an active science campaigner and blogger by night. So his nighttime activities have brought him into the heart of the debate on open access, where his interest in the role of science and society has drawn him into science policy, R&D funding, research evaluation, and scholarly publishing.
[00:05:17.90] In particular, he was on [INAUDIBLE] steering group in its review of the role of metrics in research assessment– the theme of his address today. Stephen’s daytime work as professor of structural biology at Imperial College has seen him make an important contribution to research on viruses, including the foot-and-mouth virus and norovirus. In addition, Stephen has made time to direct the college’s strategy on equality, diversity, and inclusion for staff and students in his capacity as assistant provost.
[00:05:55.10] So I am delighted to welcome Stephen, and I know from his tweets earlier today– on his way here, in fact, in which he was wondering whether there was a green Egham– that whilst he has a lot to talk about, he’s also very keen to allow some time for you to ask some questions. So over to you, Stephen. Thank you very much.
[00:06:17.00] STEPHEN CURRY: Thank you very much, Sarah, for that very kind introduction. And thank you very much to the organizers for the opportunity to speak to you this afternoon at your conference. As Sarah indicated, I wear a number of hats, or have taken on a number of hats over the years– not just as a jobbing academic for the last 25, 30 years– I’ve lost count, actually.
[00:06:39.92] But also, I’m very much interested in the business and culture of science, and that’s actually, these days, where most of my interest lies– not just in thinking through the equality, diversity, and inclusion challenges at Imperial College– and it was very nice to hear a very inclusive and welcoming opening right at the beginning of this session– but also thinking about many areas of research assessment. And so one of the other hats I’m wearing today is as the chair of DORA, which is the Declaration On Research Assessment.
[00:07:13.13] So I have been thinking and writing about these topics– open access, scholarly communication, funding, public interest in science, the culture of science– for 10 years. And I would like to be able to promise you that what I have waiting for you is a mature, and well reasoned, and slick, and fluent presentation which embraces all these topics, and provides a very neat solution at the end. Please do not assume that that’s what’s going to happen.
[00:07:44.55] I want to talk a little bit around a range of different subjects– talk about principles and philosophy of what it is that we’re here for– but also to think about how those present us with different challenges because of the way that we have come to assess researchers, which is in many ways a deeply dysfunctional system. Not through any fault of any one individual or organization, but because we have developed, I think, a sort of market-driven approach to the needs of society that isn’t always working in society’s best interests.
[00:08:18.00] I hope also to say a few things that will be germane to an audience of publishers. I don’t often– I do sometimes get the chance to talk to publishers. I know the relationship between publishers and academics is in some cases not an easy one, but I would like to reassure you, also, that I am very interested in talking to all sides on what is a very important debate and a very important topic.
[00:08:41.25] So without further ado, let me start, and let me introduce you to Aryanna. So this is Aryanna sitting on her mother’s lap. And Aryanna was born in Brazil, and while her mother was expecting her daughter, she was infected by the Zika virus. And that has resulted in Aryanna having the condition known as microcephaly, which is, as we now know, strongly linked to Zika virus infections during pregnancy. So Aryanna can look forward to a condition which has widespread tissue and cell death, brain shrinkage, impaired cognition and motor functions, hyperactivity, seizures, retinal lesions– in one third of cases, which may leave her blind– and she may need lifelong intensive care.
[00:09:34.14] And it’s arguable, I think, that our system of scholarly communication and research assessment has resulted in this child suffering from this condition. In the run up to the Rio de Janeiro Olympics, there was an initiative by the Wellcome Trust, which got together a bunch of publishers of various different hues– commercial and non-commercial– and got them to agree to allow researchers to share all their research on Zika virus and the consequences of infection prior to publication.
[00:10:12.11] They wanted to get the [INAUDIBLE] out as quickly as possible so that the whole community could work the problem. Here was an important public health issue that needed to be solved. And as a result, a clear linkage between Zika virus infection and microcephaly– a sort of novel feature of the disease as it migrated across the globe– was established.
[00:10:33.18] But they had to do that because our normal process of sharing research information involves endless delays, because it is linked so heavily to prestige, to career advancement, to funding decisions. And so what we have is a research assessment system that is working against the public interest. Not just in the case of Zika virus– so this was an initiative that related specifically to that health condition– but many more people are affected by malaria in the world, or TB, or HIV infections, and yet we don’t have an initiative that covers those infections.
[00:11:11.58] There are other areas where very important research is going on that is of deep, deep public interest. Climate change, the need to decarbonize our economies, the need to think about water supplies, food supplies, for a growing global population. And yet, here’s an example of what we do in academia, which is enabled by the whole system that we are all involved in.
[00:11:34.71] This is an advert that was posted by a relatively new member of staff at the ETH in Zurich, one of their premier research institutes, advertising a postdoctoral position. A specific requirement is to have published as a main author or co-author in a high-impact journal– impact factor above 10. It could not be clearer, OK? And if you didn’t get the message: applications not fulfilling the latter requirement will get a rejection. Now, slightly disturbingly, the ETH Zurich is a signatory of DORA, so this is the sort of stuff that should not happen.
[00:12:08.62] The PI, who was relatively new to the institute, was actually called out about this on Twitter, and actually, I did write in my capacity as chair of DORA to the president of the ETH, and they were very quick and very constructive in responding. The PI has apologized, and hopefully has learned a lesson. And the ETH, I think, is now looking again at how it communicates the values of the institution and its commitment to DORA, which is not an easy thing to do.
[00:12:38.32] I appreciate myself, from working in a large organization, that it’s very hard sometimes to reach all parts of the organization– I cannot be in every committee room calling out poor behavior. But this is the culture that early-career researchers are growing up in, and it’s one where I think we have lost sight of what’s really important. And that’s a terribly sad thing in my view, because I think many young people come into research because they want to understand the world, they’re curious about the world.
[00:13:08.95] But a very important part of that for many, many people is that they want to use those discoveries and that new understanding to make the world a better place. And I hope that that is something that we all share. I think it is a deep-seated human aspiration articulated here by Viktor Frankl, who survived the Holocaust, and saw the very best and the very worst of what humanity can do. “Don’t aim at success, for success, like happiness, cannot be pursued. It must ensue, and it only does so as the unintended side-effect of dedication to a cause greater than oneself.”
[00:13:48.83] Now I would hope that we all agree with that sentiment in this room. I will not ask for a show of hands. And I think it probably does speak to all of us. But if you’re a vice-chancellor of a successful university, you might turn around and say, what am I going to do with that? OK, how do I manage a university with that? The Times Higher Education university league tables are coming out this afternoon, and they will have the attention of university leaders around the country, and around the world.
[00:14:18.84] But this is a core value, I think, at the heart of what we are all doing– all of us involved in research, in scholarship. It is about making the world a better place. But we lose sight of that– this is a quotation from a book called The Top Five Regrets of the Dying. Every single person in this room will die at some point. Some of us sooner than others. Speaking for myself, “I wish I had the courage to live a life true to myself, not the life others expected of me.”
[00:14:53.07] But of course, if you work in academia, you are constantly fretting and worrying about what others expect of you, and what the system expects of you. So this is what is expected of academics. Here’s the list of impact factors of journals updated annually. Here’s the World University Rankings from 2013-2014. More or less the same names will appear later this afternoon. But we are boiling everything down, too often and too much, to a bunch of numbers– to a bunch of metrics.
[00:15:24.81] Now there is some interesting information in there– I’m not going to deny that– but the numbers have taken over in a way that is really rather troublesome. This is not something that’s unique to academia, and indeed, I think it might be argued that there is a broader movement– certainly within Western market economies in the way that societies and governments run things, and businesses, and industries– which is to metricize it– to manage it by the spreadsheet.
[00:15:53.60] There’s a very good book that analyzes this across a number of different sectors– not just academia, but also looking at the police force, and the way that hospitals are run. And we have what Jerry Muller calls a new managerialism– we have generalists, not generals. So we have professional managers who think they can come in and run any organization, as long as they have all the information. You give me the spreadsheets, I can manage this organization, and I can make it work better.
[00:16:18.63] And so you end up with a kind of professional managerial class who are good at managing but not necessarily good at understanding the nuances and the culture of the organizations that they’re leading. And understanding cultures means understanding people. I think we have lost sight, really, of the impact that all our efforts to measure and to manage things have on the people who work for these organizations.
[00:16:45.39] So metrics can be useful– I’m not here to tell you that quantitative information is not helpful. I am a scientist, after all– that’s meat and drink to me. But they are not useful if misapplied by people who do not understand their context– and that is a problem that we have sometimes in the higher education sector– and not if tied too tightly to extrinsic rewards. And I think we want to try and make sure that we reconnect with the intrinsic drives that actually brought most people into a life in scholarship in the first place.
[00:17:15.99] Our difficulty is that we have a focus on measurable performance, and that’s what’s driven by this obsession with metrics and using quantitative information to make important decisions. And that information is useful, but it leads managers to neglect tasks for which no clear measure of performance is available– the intangibles. And I think all of us have got things in our life which are intangible, but which are vitally important to us.
[00:17:42.98] So it is easy to criticize metrics– and goodness me, I have done it. I could bore for Britain on the subject of impact factors, starting back in 2012 on my blog– this is still one of my most read blog posts. But we know– and I’m sure many people in this room who’ve thought for more than five minutes about this are aware of the sorts of problems that we have.
[00:18:07.62] So if we have an evaluation based largely on journal metrics– and that’s one of the major problems that we are grappling with, as you will have seen in that advert that was posted by the ETH Zurich– the chase for the impact factor slows down publication. In my own case, my worst offense was submitting the same paper to five different journals before it got in. So Nature, Science, [INAUDIBLE]– did not see or agree with my brilliance, but it did eventually get published.
[00:18:35.42] But I spent a year not publishing information, and I don’t do that anymore. We have a positive bias in the literature, because of course, the impact factor is an important marketing tool for any journal, and they are going to want to make sure that they publish papers that will get citations. But that creates, I think, a conflict of interest with the greater good. There’s no place for sharing negative results, and negative results are an important finding, but they simply get lost, because there is no recognition of their value.
[00:19:08.00] We have metric-driven hypercompetition– only the result matters. It’s the paper that matters, and getting into Nature, Cell, Science, et cetera– those are the things that, increasingly, I hear PhD students and postdocs think they need simply in order to secure a career. The intensity of the demand is, I think, even more acute than I felt 25 years or more ago, when I was setting out on my own research career.
[00:19:36.85] So that incentivizes fraud. We know that the one thing that correlates most strongly with impact factor is retraction rate. And we know that fraud is still happening, and it’s human nature. This is a result of Goodhart’s Law. And I know it’s still a small minority of cases, but that undermines reliability and public trust.
[00:19:59.23] And even if the public were to realize that in the chase for prestige, and in the chase for advancing our careers, we agreed to a system that slowed down the sharing of publicly funded research results, I think they would be horrified, and I think that would be a great loss of trust in the research community as a result. We are lucky at the minute that most people don’t realize that that’s what we do, but that is what we do, and I think we have to stop doing it.
[00:20:27.46] We have a focus on research output– it’s really the paper that matters, not the person, necessarily. We don’t think about the intangibles that are actually harder to capture– the real-world impact, the mentorship that happens, the training of young scientists. These are very important outputs, but they’re not captured in our traditional systems of evaluation.
[00:20:48.86] And there are painful consequences, then, for researchers– stress, anger, envy. It’s not a life that I would necessarily recommend to young people. I was telling someone at lunch I have three grown-up children, and none of them have gone into academia. And that is actually a source of relief to me in some ways, but I think it shows that actually, they are smarter than I am. But probably they haven’t gone into it because they’ve seen the impact that it’s had on me over the years, and they have a much healthier attitude to work-life balance as a result. So I am hoping to learn from them in the years to come.
[00:21:25.15] I would recommend also to you this paper Excellence R Us by Cameron Neylon, and Martin Paul Eve, and various other co-authors, which examines this concept of excellence, which I think is something that– we talk about excellence all the time at Imperial College. Everything has to be excellent. It’s excellent research, excellent teaching.
[00:21:43.33] And excellence is important, but it is a slightly suspect concept– one that’s difficult to pin down, and one that often tends to be exploited to reinforce the existing establishment, and existing hierarchies of prestige. And that works against, then, the broader interest, and these authors, then, argue for a concept of excellence, or an alternative rhetoric, which is based on soundness. It’s much more important to produce reliable research than research that gets you into the top journals but isn’t wholly reliable.
[00:22:17.20] And I just noticed– actually, I read it on the train coming up here, so I have actually modified my slides since I told myself I’d finished them this morning. So this was published yesterday– Jeremy Farrar. And it is very much taking up that message, and questioning our ideas about excellence. We need to reimagine how we do research. The emphasis on excellence in the research system is stifling diverse thinking and positive behaviors, because we have got into a vicious circle that is reifying the published paper, and leading to all sorts of detrimental effects in terms of how we share research, how open we are about it, and our behaviors as well.
[00:22:56.83] And it tends, also, to homogenize the academy. When push comes to shove, if you’re in a very competitive environment, you will hire people who look like you, or you will hire– which basically means, in many cases, you will hire a white man, because hiring a white man in the past is the thing that worked. And we know from the academy, and I know from the demographics of our own university, and from many others across the UK, that we still have big challenges when it comes to gender equality and to diversity, and to being an academy that truly reflects and involves the whole national and international population.
[00:23:31.76] And so here, this is a very honest piece by Wellcome, and I think this is a really important statement. And certainly other funders and UKRI have started to put an increasing emphasis on this, in thinking about the people who are involved in the system and not just the product. The product is important, but it’s the people that matter as well. And so it’s about coming back and thinking about things that are greater than oneself.
[00:23:55.67] And I think the move to openness is really a part of that. And it’s no accident, really, that the drive to open access has caused people to rethink modes and mores of research assessment, because it’s about creating new spaces for new types of publication, new ways of thinking about how we communicate, and very much ties into the idea of making sure that everybody is involved, or everybody has access to what is, in three-quarters of cases, publicly funded research.
[00:24:27.82] Now in the academy, you will hear statements like this– I’m all in favor of open access– and most academics– they get it, they see the point of it. But what about my career? What if I can’t publish in the journals that I want to publish in, or that I need to publish in, in order to get the recognition that I deserve? People can’t trust the system to judge them on what they’ve done– they will look at the name of the journal first.
[00:24:50.90] What about the learned societies? I appreciate– and maybe we can get into that later in the discussion, and I am actually determined to finish well before my hour is up so that we can have some of that discussion– what about the costs? We know that there are many challenges there. What about predatory journals, which are a potential risk, although I think one that could be easily dealt with through openness. If you had open peer review, you would kill every single predatory journal in the world overnight.
[00:25:19.08] So we have a system here that is working very much in tension with the greater good. And someone else who’s written very eloquently and passionately about this is Frank Miedema, who’s at the University of Utrecht in the Netherlands. So “despite personal ideals and good intentions, in this incentive and reward system, researchers find themselves pursuing not the work that benefits the public”– and Frank is largely a medical researcher– works in HIV– “or preventative health or patient care the most, but work that gives the most academic credit”– and we are really focused on academic credit, and that is proving to be a huge drag on the system and the move to open access. And I think also, it is damaging, then, our relationship with the societies in which we are embedded, and on whom we rely for funding.
[00:26:07.15] So I think– and I know the theme of this conference is thinking more about open access and open science as the broader agenda, and clearly it’s the direction of travel. I think in many– certainly in the UK and in Europe, if one looks at the science policy, then clearly that is the direction of travel– we’ll get to Plan S in a moment. But I think broadly, the impulses and the motivations of the open access movement in the founding declarations are good ones, are workable ones– but clearly we have been struggling to put them into place.
[00:26:40.80] There has been a huge amount of progress, and I know there is a whole spectrum of views among people who are known as open-access advocates. I count myself an open access advocate, but I’m not the same as some of the people who are perhaps more in the vanguard, saying we just need to tear down the system and start all over again, and just do it really cheaply– $10 a pop. I think it’s a bit more complicated than that.
[00:27:04.84] But I think it’s the way that we have to go in order to make sure that we restore a system that speaks to our values and that speaks to the values of the societies that we want to serve. So there have been many innovations. I’m a life scientist– preprints are something that is relatively new to the life sciences.
[00:27:22.32] There was actually an early experiment in the 60s that was actually killed off by a bunch of publishers– there’s an interesting paper in PLOS Pathogens or PLOS Biology by Matthew Cobb that explores that history. But preprints have come back really quite strongly in recent years, led by bioRxiv, and there are many other preprint servers arising in many other disciplines.
[00:27:44.70] And I really like the idea of preprints. And in my lab, I’ve adopted it in recent years as a standard practice. As soon as the manuscript is ready to go to the journal, we will preprint it. It’s faster communication. I like the subversive nature of it– because there’s no journal name on a preprint. You have to make do with the title of the paper, the names of the authors, and the abstract if you want to make a decision about whether or not you’re going to read any further, or whether this is useful to you.
[00:28:09.76] So the focus is on the content, not the container. And I think increasingly, people are looking to the bioRxiv server. I heard someone complaining on Twitter yesterday morning, because the server was down and they were hungry for their daily fix. And I know that in many fields of physics, where a preprint server has been in operation for much longer– in many fields, the first thing they do when they get in the morning is to have a look at the arXiv server to see– have they been scooped? And if they have been scooped, the protocol, as I understand it, is that they’ve got 24 hours to get their paper out, and then it will be considered a joint publication. But it’s an important venue, and it is one where good work is seen to be published.
[00:28:47.82] I am a believer in peer review, but I think also as a community, we tend to reify peer review– we don’t often enough acknowledge the flaws with it. As an author myself, I like to think that actually 80% to 90% of the value in my paper is in the paper that I submit to the journal. I have often had very helpful comments from referees. In some cases, not so helpful. But rarely has it– thank goodness– found a massive logical flaw which means that the whole edifice crumbled in my hands.
[00:29:24.97] But peer review is not a guarantor of quality or excellence. It is a useful feedback service, but I think we need to recognize that actually, if people with good intent are posting to preprint servers, then you know there is good material there. And I think we’re learning that and seeing that, certainly in the biomedical areas, with the rise of bioRxiv.
[00:29:47.73] The value of it is, of course, that it also gives the largest possible audience, so we get wide sharing. And of course, that increases scrutiny, which hopefully also then increases reliability. And in many cases, the wider readership will pick up flaws that referees did not spot, because there are a finite number of referees, and they can’t be expected to do everything. And if they’re anything like me, they’re often fitting this in around many other tasks at the same time.
[00:30:15.28] And of course, that sharing and scrutiny applies to open-access papers, and that’s one of the real values of open access. I think the idea of preprints also encourages new thinking about open peer review, which is still very much a minority activity among many journals. I favor it very strongly.
[00:30:40.23] There are different modes of open peer review. I agree with some concerns that if early-career researchers are reviewing the papers of more senior and established researchers, they may not want their name shared, because people are human, and sometimes if you are critical, there may be retribution which you can’t control for. But I would at least like to see anonymous reports published, because I think that encourages professional behavior on behalf of reviewers, and it adds to the richness of the dialogue surrounding a paper, so there is greater transparency about what a scientific paper or a research paper actually is.
[00:31:15.13] The motivation for preprints is very much in line, of course, with open access– and in line with that too is the idea of data sharing. And I know that many journals now absolutely require data to be shared as part of the paper, and that it’s simply not good enough if you are going to publish a paper and draw conclusions, but not supply the data themselves to be properly interrogated.
[00:31:35.82] That’s an important innovation. I appreciate there are huge technical challenges associated with that. I work in the field of structural biology, and we have, fortunately, a data repository that’s been going since 1976, which is free to use, and there’s a very standard data format that’s supplied. And there are huge challenges with storage, and accommodating data formats, and actually sharing data in a way that is truly reusable in many different fields. So that’s not a problem that’s going to be solved overnight, but it’s clearly part of the direction of travel.
[00:32:06.15] And all of these practices– preprints, open access, data sharing– software sharing as well, I could mention– these are all better adapted for changing the world to make the world a better place. And that, again, speaks to our values. And so the arguments for open science and open access, I think, are incontrovertible.
[00:32:25.38] But nevertheless, adoption needs to be incentivized and facilitated. Academics are busy people, so we’ve got to try and make it easy for them– make it as frictionless as possible. And that’s why, when open access came along, I think it did stimulate an awful lot of thinking about how it is that we reward people for what they do.
[00:32:46.68] So I think one can make a very powerful case for open access. It hasn’t hit home in every lab or in every discipline, and this is a piece by Robin Osborne, who’s a historian at– Oxford and Cambridge, I can never tell those two twins apart. “Academic research is not something to which free access is possible. It’s a process that universities teach at a fee”– in the UK– “for those who wish to have access there is an admission cost. They must invest in the education prerequisite to enable them to understand the language used.”
[00:33:17.94] So here is a scholar who thinks that what he writes is so full of genius that ordinary people cannot understand it. And I disagree profoundly with this view. There is a lot of academic research published which pretty much nobody understands, but I think this massively underestimates the different capacities of the public. And it’s a mistake to think of the public as one homogeneous group of people who didn’t go to university.
[00:33:44.15] There are many members of the general public who did go to university, and who are intelligent, and even people– there are people in this country who didn’t go to university but are smarter than the people who did go. So there’s a huge range of different audiences out there, many of which are more than capable of engaging with the literature.
[00:34:03.44] And even if that were not the case, I think that making research open access is important simply for democratic accountability. And I’ve written about that a little bit in a chapter in a book called The Science and Politics of Openness, which is an open-access book, so you can download it from the URL on my slide, and I’m very happy for the organizers to share my slides. It’s very hard to write down these sorts of URLs because they’re not very memorable.
[00:34:31.40] So here I was exploring the value beyond the academy of open access. And one of the things that I think is important about it is providing that public accountability. The public are paying for it, and we should be thinking about making it available to them. I’m not suggesting we write it in lay language, because clearly there is a technical language associated with many disciplines that is part and parcel of the communication.
[00:34:54.29] But if we make it available, then there are other people who can interpret it or rewrite it– lots of bloggers out there– that can make it more available– journalists, if they have access to it. It provides the information. But even so, there are members of the public– and I’m thinking particularly, for example, of patient advocacy groups– who are often very expert and very knowledgeable about the current literature, and very, very, highly motivated to know and understand what’s happening next, and even to start trying to help direct the sorts of questions and the sorts of problems that medical research charities, and even research councils, might think about.
[00:35:31.64] So openness is a two-way street, very much. And I think one of the interesting challenges for us is making sure– and it’s one of the pressures and pulls that comes from open access– is opening up the academy to the pull from the public– engaging properly with them, and understanding, from their point of view, what they would like publicly funded researchers to study.
[00:35:56.66] My own experience of doing some public engagement work with children is that children are by far the best people to talk to when it comes to asking really tough questions of researchers, because they don’t have any baggage, or any pretenses or airs of being clever. They cut to the quick and ask really tough questions about– how are you going to solve cancer, and what are you going to do about it? And I’ve had much tougher questions from children than I have had in most of the academic seminars that I’ve ever given.
[00:36:22.97] There’s an interesting initiative which I have a sort of loose association with. I was part of a workshop which was held in the horrible environment of Mauritius back in August over a week– and this is the Curtin Open Knowledge Initiative. And again, this is a project that’s led by Professor Cameron Neylon and various colleagues at the Curtin University in Western Australia.
[00:36:46.61] But it’s trying to reimagine universities as open knowledge institutions, and to get them to think as institutions about making themselves open. Not just making sure that what they publish is open access, but thinking about– well, you have a big library and library resources– is your library open to the public? Are people able to come and talk to your researchers? Now there are, of course, lots of challenges around enabling that, and I’m not saying that everybody should be doing it every day. But it is about thinking through how open access, and open science, and approaches to openness should cause us to rethink how it is that we interact with society.
[00:37:26.33] And it has made me think more and more about the idea of openness in science– or open science– being about not just openness, but being about inclusion. And with my other hat on as assistant provost for equality, diversity, and inclusion, then open access and open science raises questions about who gets to participate. Not just in publishing, because we know that open access and the APC model is one that– it seems very expensive from the perspective of the global south, for example.
[00:37:55.35] And so there are residual issues of global inequity there that we still have to deal with. But it’s also, then, thinking about not just who gets to talk to researchers, but who gets to be a researcher. We have massive problems, still, with underrepresentation of women in the academy, not to mention ethnic minorities, and thinking about the experiences of people who are LGBT, or people with disabilities. So there are lots of different areas where we need to think about it.
[00:38:23.69] And as I’ve mentioned, I don’t think it’s any accident that the rise of open access has triggered a lot of thinking about research assessment reform, and there are a number of interesting developments in that area. DORA was one of the first to gain a bit of prominence. It actually started in about May 2013. The Leiden Manifesto set the ball rolling with regard to responsible research metrics, and that was a message that was very much taken up and reinforced by the Metric Tide report, which I was involved in.
[00:38:56.84] And then in May 2017, I was lucky enough to be involved in a project led by Professor Aileen Fyfe, who’s at St Andrews, who is a professor of the history of science, and studies scholarly communication over the centuries. And we co-wrote a report– which is open-access, and you can download it from that link there– looking at how the original mission of many learned societies, which ran their journals as a community service– in almost every single case at a loss– has changed since World War II. Many societies learned the lessons of Pergamon under Robert Maxwell, and of various other companies, which saw that there was really good money to be made in scholarly publishing, and they have changed their approach to publishing so that it is now a very important income generator.
[00:39:50.00] And that is, of course, as I’m sure many people in this room know, one of the real challenges of thinking about how you move from there to really supporting open access, which ultimately I think is at the core of the mission of many learned societies, since it is about disseminating information.
[00:40:07.89] So if you haven’t had a look at the report, I would like to recommend it to you. There’s just a couple of passages I want to highlight. There are a number of recommendations that we made at the end of it, aimed at different stakeholders, government included, but also researchers and scholars– we would ask them to consider their responsibilities. Often in these debates, you will hear academics talk about academic freedom, and their freedom to publish where they choose, but they are less vocal– and I would like them to be more vocal– about their responsibilities, particularly their responsibilities as publicly funded men and women.
[00:40:40.47] So they have responsibilities that sit alongside academic freedom, and should reflect on whether they might re-prioritize the duty to communicate rapidly and widely in the face of the reputational credit that is earned through publication, which is, of course, the thing that obsesses us all. Learned societies– and I hope to get to this in the questions– “have a charitable educational and academic mission, and should consider the appropriate balance between their desire to generate unrestricted income from publishing for charitable activities– and internal activities– and the long-term consequences of allowing the publication of academic research to continue to be dominated by commercial models.”
[00:41:13.74] Those are challenges. There are a number of particular challenges for learned societies in the recommendations of the report. Societies– and perhaps we’re doing it today, so a big tick for ALPSP– should facilitate discussion and greater awareness among members about the relation between academic prestige, the publishing industry, and the circulation of knowledge.
[00:41:37.05] Ideally, annual reports would explain the organization’s rationale for their pricing, and how this is justified by the organization’s mission– so have that conversation with your disciplinary communities. Reflect on whether the mission and business strategy of your co-publisher– if you’re a co-publisher with a commercial company– is a good fit for the scholarly mission.
[00:41:59.40] Disciplinary communities should embrace the opportunities for more rapid and widespread circulation of research offered by preprint servers, and hopefully many of you are encouraging and supporting that– and I think it’s getting to be nearly universal now with many publishers. And learned societies should be open to discussion with other societies and consider whether pooling resources could enable the creation of a low-cost, sustainable, online, not-for-profit model of academic publishing. So we do have to think about alternative models. There are no easy answers to this, and it’s very easy to stand on a podium and tell people what they should do. But I am here not just to do that, but to have a conversation about it.
[00:42:40.81] So DORA has very much emerged in this space to try and help facilitate conversations about research assessment, and to think about new ways of evaluation, and to create more space, in a way, for open access. It’s not really born of the open access movement, but I think it was given a new urgency by the way that research assessment mores were interfering with modes of publication and slowing the sharing of scholarly information.
[00:43:10.81] So DORA, if you haven’t heard of it before, is best known in the research community– it emerged from a group of scholarly societies and publishers in the molecular and cell biology fields, primarily focused on the American Society for Cell Biology and EMBO. And the declaration is best known for being down on the journal impact factor.
[00:43:33.79] And so the number one statement in the declaration is: do not use journal metrics such as impact factors as a surrogate measure of the quality of individual research articles to assess an individual’s contributions in hiring, promotion, or funding decisions. And it is often seen, therefore, rather negatively. So: we don’t like the impact factor. We think there are lots of problems with it, and many people have spoken out against it, myself included.
[00:44:01.32] But what people tend to overlook are the 17 other recommendations within the declaration. It’s a fairly short document– it’s only about three sides if you print it out– and there are recommendations for different stakeholders, which includes institutions, for example, which have a key role to play, because often it’s institutions making key decisions about hiring and promoting people.
[00:44:22.34] And so it’s about being explicit and focusing on the scientific content of a paper, rather than publication metrics. And then thinking more broadly about research outputs and research impacts– data sets, software, but also, I would argue, mentoring, contribution to teaching, contribution to the internal running of the department, policy work, and societal engagement.
[00:44:47.69] For publishers, there are a number of provisions, again– recommendations which we think are worthy of consideration. So reduce emphasis on the impact factor– and one way to do that is by presenting that metric in the context of other journal-based metrics, or even, as I would argue, and I’ll show you in a minute, citation distributions, just to detoxify that particular metric.
[00:45:15.26] Make available a range of article-level metrics, again, just to re-draw attention to the content– assessment based on the content of an article, not on the journal, because the range of citation performance within any one journal varies by two or three orders of magnitude. And even citations are not in themselves a measure of quality– they are a proxy for quality, not a guarantor of it. Highly-cited papers may be rubbish; lowly-cited papers may be very influential in a particular niche.
[00:45:48.86] Encourage responsible authorship practices and the provision of information about the specific contributions of each author– and that is a practical way of enabling people to speak in their CVs about exactly what it is that they have done. So DORA is unique as an organization in that it is very much a campaigning organization that is trying to clear a way for thinking about better methods of research assessment.
[00:46:17.81] So we are encouraging people and organizations to sign up. We’re about six years old now. We relaunched a couple of years ago, because we were a little bit lost. And our search engine optimization was not the best, because if you search for DORA on Google, you get Dora the Explorer. And actually, I saw a bus yesterday that had an ad for a movie– so I think Dora the Explorer has now got a movie franchise. I wasn’t quick enough to get a picture of it, unfortunately.
[00:46:47.16] But we now have a much simpler URL, and hopefully we are working on increasing our profile. Over 14,500 individuals have signed, and 1,500 organizations. We now have an international steering group, which I’m pleased to be chair of, and we have formed a global advisory board, because there’s no point in trying to reform research assessment just in the UK, or just in Europe. Science and scholarship are truly global endeavors.
[00:47:18.92] We operate in the republic of letters, shall we say, which recognizes no national borders. So it really has to be a global endeavor, and as we hear from our advisory board, which comes from all over the world, many of the problems that we face in this country are faced in Africa, South America, India, and China.
[00:47:40.76] So we have our roadmap– a set of priorities that are guiding our activities at the minute. Trying to raise awareness– that’s why I’m here today. I’ve been going around talking about DORA on many occasions this year. But we’re not here just to hector people and to tell people how evil the impact factor is, and how bad university rankings are, and how terrible they all are for believing in them, or sticking them on their journal information pages and whatnot. Because we realize there are good reasons for publishers and journals to do that, for example.
[00:48:11.77] But we’re very much about trying to enable routes to best practice, because the trouble is if you say don’t use the impact factor, people will say, well, what else am I going to do? Am I going to read the CVs and the papers of the 300 people who’ve applied for this professorship job that I’ve advertised?
[00:48:28.79] And we’re not saying that you should do that, but we are trying to surface good practice where it is already emerging in various universities and funders and learned societies around the world, and now increasingly trying to convene meetings where we can get people together to discuss this issue, and to try and come up with practical alternatives to do a really good job of research assessment in a way that recognizes the whole dimensionality of contributions that scholars and academics make.
[00:48:57.77] And we are aiming, also, in doing that, to extend the global and disciplinary impact. So here’s our steering committee. I won’t dwell on that– it’s on the website. But I do want to dwell on the advisory board. And we have an advisory board of about– what is that– 16 people. It’s chaired by Ginny Barbour, who’s in Australia, but we have– and we were very glad to be able, very quickly, to recruit representatives from all corners of the globe.
[00:49:24.14] So we have people from North and South America, from Africa, from Asia and the Indian subcontinent, and all over the world, and we can meet thanks to the internet and video conferencing technology. We’ve been meeting twice a year so far, but that’s actually going to go up to four times a year. And it’s been really, really important to actually hear the voices from parts of the world that too often don’t get a word in debates about scholarly communication and about open access, because these are the parts of the world where most of the world lives.
[00:49:57.12] And so it’s very much– I’m very pleased that DORA is trying to reorient itself to be a truly global organization, and one that is listening to and speaking for the world. Now I did talk a little bit about practical initiatives. We know that declarations are not enough. It’s not enough that an organization has signed DORA, as we saw with the ETH.
[00:50:18.86] The organization itself then has to commit to that and has to implement that, and that takes time, and that takes resources, but we are very much trying to enable people to do that. The American Declaration of Independence is a very high-minded and noble declaration, but I don’t think the United States has yet achieved all its aims with regard to life, liberty, and the pursuit of happiness.
[00:50:45.65] We had the Finch report of course, famously, in 2012 in the UK, which made very powerful arguments for open access, but then had very interesting and controversial policy recommendations about going for gold. Carlos Moedas, who’s about to retire as European commissioner, has been very instrumental in committing Europe. And of course, Plan S is one of the fruits of that. And whether or not that is a sweet or a bitter fruit depends on your point of view, and is something that is going to exercise us for a little while longer.
[00:51:17.75] I think policies can be terribly helpful. This was controversial in some eyes, but the linkage of deposition of papers in institutional repositories to their eligibility for the REF has certainly got the attention of the UK research community. And so from deposition levels of about 20% to 30% in our institutional repository at Imperial, we are now, I think, approaching 90%. And to my mind, this was a stroke of policy genius that does, in fact, get the work out there in a fairly frictionless way. And I want to pay tribute, actually, to the library staff and information handling staff at Imperial who have made this a relatively simple thing to do.
[00:52:03.02] Some of the work is technical, and some of it can be implemented by journals and by publishers. And this is a preprint that myself and a number of colleagues wrote a couple of years ago, which was actually providing a recipe for publishing citation distributions. So we know in the real world, journals are never going to stop publishing their impact factors or thinking about them, because they’re in the business of running a successful journal, and that is one of the things that their authors will want to know anyway, so why hide it from them?
[00:52:31.47] But we wanted to– boiling things down to a single number is one of the sort of fundamental problems of dealing with metrics, because you lose so much information and so much nuance when you are doing that. So we had a simple method, and it has been taken up by a number of journals– I would like to see more journals doing it. And it makes people think a little bit more about what lies behind the impact factor, and thinking that citations aren’t necessarily the be-all and end-all.
[00:52:56.99] The distributions that one sees have a characteristic shape for all journals. Most papers in most journals do not get the same number of citations that the impact factor would indicate to you, because you’re taking an arithmetic mean of a non-normal distribution. So it is, to coin a phrase, statistically illiterate.
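To make the statistical point concrete, here is a minimal sketch– with synthetic, invented citation counts rather than any data from the talk– showing how a JIF-style arithmetic mean sits well above what a typical paper in a long-tailed distribution actually achieves:

```python
# Minimal sketch: why an impact-factor-style arithmetic mean misrepresents
# a skewed citation distribution. All numbers are synthetic, for illustration.
import random
import statistics

random.seed(42)

# Simulate citation counts for 200 papers in a hypothetical journal:
# a long-tailed (log-normal-ish) distribution, as real citation data tend to be.
citations = [int(random.lognormvariate(mu=1.0, sigma=1.2)) for _ in range(200)]

jif_like_mean = statistics.mean(citations)   # what a JIF-style average reports
median = statistics.median(citations)        # closer to a typical paper
below_mean = sum(c < jif_like_mean for c in citations) / len(citations)

print(f"mean (JIF-like): {jif_like_mean:.1f}")
print(f"median:          {median:.1f}")
print(f"share of papers cited less than the mean: {below_mean:.0%}")
```

On skewed data like this, roughly seven in ten papers fall below the mean– which is exactly the distortion that publishing the full citation distribution is meant to expose.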
[00:53:16.06] I was pleased a couple of years later that Clarivate now actually calculates these citation distributions, and will actually provide them to any journal, and will permit you to print or publish their version of the citation distribution on your journal information page. So if you didn’t know about this already– and there is some interesting work going on at Clarivate about thinking about the uses of different metrics– there’s a problem, also, with the h-index, which I won’t go into.
[00:53:44.92] But it’s very much similar to the JIF, because it just boils an individual down to a single number, and you lose so much important information when you do that. And then even thinking about university rankings– they try to capture a research footprint, and look at the different aspects of what a university does. But of course, what all but one of the current ranking systems do is end up with an aggregate score, and that aggregate score is deeply, deeply problematic.
[00:54:12.90] And to me, there’s an intellectual dishonesty in the construction of league tables that I think ranking organizations simply have not embraced. They know about it, because I’ve talked to some of them about it, but it makes too good a story, and it’s nice to have a league table, even if it isn’t defensible or truly meaningful.
[00:54:35.45] There was another group that I was involved in recently– a project, again, with a range of stakeholders: publishers, some bibliometricians, also some of the data-providing organizations– members of Clarivate and Elsevier were there– thinking about, again, trying to draw attention away from the impact factor, and to think about the provision of other metrics that actually characterize the important things that journals do.
[00:54:58.97] Registering a piece of work– who did it, who gets the credit for it? Curating it– bringing together interesting papers from across a discipline, maybe highlighting new breaking findings that might create a new niche. Evaluating it– running peer review, and doing that in a timely fashion if possible, and having a metric to indicate how many reviewers you typically get, and how long it typically takes to get a paper through review.
[00:55:24.35] These are the sorts of journal metrics that would-be authors would be interested in. How good are you at disseminating work to the audiences that really need to get it, and what’s the commitment to open access and archiving? And I’ll leave you to read that in your own time.
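As a sketch of what such journal-level service indicators could look like in practice– every field name and value below is a hypothetical illustration, not an agreed standard– consider a simple record like this:

```python
# Illustrative only: a set of journal service metrics loosely following the
# registration / curation / evaluation / dissemination roles discussed above.
# Field names and values are hypothetical, not any published specification.
from dataclasses import dataclass

@dataclass
class JournalServiceMetrics:
    journal: str
    median_days_to_first_decision: int   # evaluation: speed of peer review
    median_reviewers_per_paper: float    # evaluation: depth of peer review
    share_reviews_published_openly: float  # transparency of review (0-1)
    share_articles_open_access: float      # dissemination (0-1)
    long_term_archiving: bool              # commitment to archiving

example = JournalServiceMetrics(
    journal="Hypothetical Journal of Examples",
    median_days_to_first_decision=42,
    median_reviewers_per_paper=2.5,
    share_reviews_published_openly=0.8,
    share_articles_open_access=1.0,
    long_term_archiving=True,
)
print(example)
```

A record like this reports what a journal actually does for authors and readers, rather than compressing everything into a single citation average.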
[00:55:44.58] And also, in terms of research evaluation, one of the things that DORA is doing, as I said, is trying to surface good practice. And there’s an interesting piece of work done at Utrecht University, led by Frank [INAUDIBLE], where instead of just using the usual CV– look at the impact factors, and look at the list of publications, and how much grant income do you get– they now ask people who apply for jobs and people who are applying for promotion to write a short essay, in a way, where they have to speak to five different points.
[00:56:14.90] They have to talk about their research, publications, and grants– OK, this is the real world– but also about their managerial and academic duties: how good a citizen they are at the university, their contribution to mentoring and teaching, their clinical work– because they’re at a university medical center– and then entrepreneurship and community outreach. Again, this is an attempt to capture these qualities in a succinct, concise form– a type of biosketch. It is a tool that I think should be increasingly used, because it allows for a more discursive but still concise presentation of the qualities of researchers that we think should be important. And of course, each institution could tailor it to their own needs.
[00:56:58.10] And DORA is increasingly trying to help disseminate these good practices– we have a list of them on our website– but we’re also now convening meetings looking at tools for eliminating bias in research assessment. And we’re having a meeting, co-organized with HHMI, in Washington next month to look at driving institutional change.
[00:57:24.17] And one of the things we want to try to do is get more US universities to sign up to DORA. Europe has been quite good at signing at an institutional level, but in the United States the institutional signatories are relatively few, and we want to explore why that is and what we can do to facilitate it. So we are trying to do the work of providing practical solutions in order to deliver on the values that we think are important, both in terms of openness and in terms of societal impact.
[00:57:52.67] For the future, in Europe– assuming that we are still part of Europe; it changes by the minute, and depending on the Supreme Court next week, who knows what will happen– Europe has been terribly influential in this. And of course, under Commissioner Moedas, there really has been a push towards open science– not just because it’s a duty to the public, but because, as I made the point earlier, openness is really about protecting the integrity of the scientific record as well, about building trust in the scientific and research endeavor, and about trying to find new ways to involve the public in that.
[00:58:32.07] And I think that’s healthy and necessary for the maintenance of a democracy. Whether or not we still have a democracy in this country is something we can deal with, maybe, in the questions. The funders are obviously very important in this. There was an interesting analysis of the future of scholarly publishing– again, published under the auspices of the European Commission.
[00:58:52.55] But there was an interesting line in it which noted that out of all the stakeholders– academics, universities, journals– funders are the only one not ranked in some way, not subject to this kind of brutal evaluation. And that gives them the freedom to act, and they have taken that freedom in the shape of Plan S, which has been under discussion for a year and is coming down the line in 2021. I’m not even sure that everybody at Imperial College is quite aware of the implications of it.
[00:59:26.90] So at the minute, you cannot publish in Nature if you are funded by a Plan S– that is, a Coalition S– funder. But it is deliberately provocative. I think there are a lot of good things in it. I think some of the communication about what they were trying to do and how they aimed to go about it could have been a lot better, because of course there was a lot of pushback from a lot of quarters. And a lot of that is genuine– there are genuine concerns that we do need to address.
[00:59:54.45] But one of the things I was pleased to see in it– and I think it is absolutely necessary, if we are going to go down this road– is that it is about clearing a space and rethinking how we do research evaluation. DORA was not involved at all in any of these discussions beforehand, but I was pleased to see that they are using DORA as an example of good practice.
[01:00:18.59] And what is it that we mean by good practice? What is it that we mean by success– to come back, in the end, to Viktor Frankl’s quote? So this is my recipe, or how I would see success in research and in researchers. I would like to see reliable, rapidly communicated, accessible, high-quality research that transforms our understanding of the world and can change it for the better.
[01:00:43.76] I would like to foster and support researchers who collaborate, who feel a duty of care to their group members and colleagues, and a responsibility to the societies of which they are a part. And a research system that values the people within it– that’s the thing that I think we have lost sight of.
[01:00:58.34] And it is easy to lose sight of it amid the everyday pressures and stresses that we are all subject to– a system that cares about their quality of life, thinking about my children, and that seeks out the creative vigor of diversity. That still remains a challenge and a struggle for us, but many institutions and funders are now really thinking about it. And I think we all have to think holistically about how we would go about that.
[01:01:26.09] So how do we get there? OK, that vision is a kind of utopia. But fortunately, I read a book called Utopia for Realists, which I would recommend to you– it’s quite a provocative and interesting read. “Governing by numbers is the last resort of a country that no longer knows what it wants,” and maybe there’s a message for all of us there. “If we want to change the world, we need to be unrealistic, unreasonable, and impossible.”
[01:01:47.57] Because people used to say it was impossible to live without slaves, or that it would be completely ridiculous to give women the vote. And we know that actually those things were quite possible and wholly reasonable. But of course, we don’t always want to be unreasonable and unrealistic in this debate. I think a lot of debates about open access and open science have got overheated, shall we say. So it’s very much a matter of trying to make sure that all stakeholders– everybody who’s got skin in the game, everybody who’s got a common set of values and wants to make the world a better place through their contribution to it, whether you’re an academic, a publisher, or a funder– think through this problem together.
[01:02:32.03] There’s an interesting essay on utopias written by Adam Gopnik in The New Yorker. And the key message is that there’s no single rule that is applicable to all cases. There’s no silver bullet for open access or research assessment. There are probably lots of different solutions that will work with different communities and in different situations, and I think we have to be open to exploring them, to acknowledging the complexity, and to acknowledging that there are arguments about it. And even if we make steps in the right direction, we still won’t quite be where we want to be, and we may even have caused a new problem that we will then have to deal with. But let’s make that journey together.
[01:03:13.47] And again, The New Yorker is one of the few journals that I subscribe to, actually, out of my own pocket. Atul Gawande– a wonderful surgeon who thinks about public health– wrote a beautiful essay about slow ideas: about how you convince people to change the way they do things when they are a little bit stuck in their ways. “We yearn for frictionless technological solutions”– like the impact factor, for example– “but people talking to people is still how the world’s standards change.”
[01:03:41.94] And that is a message that I have never forgotten since I read it. In my last slide, I recommend to you Margaret Heffernan, who thinks very deeply about how organizations work and how they work well together. There is going to be conflict when we have debates about open access, and when we have debates about where the funding is going to come from for Plan S, and things like that.
[01:04:06.40] There is going to be conflict, and we have to be ready for that. We have to have the patience and the energy to engage with it. And she says– and it’s really quite telling– the more I thought about this, the more I think that it’s a kind of love that one needs in order to express that commitment. You simply won’t commit the time and energy if you really don’t care. I care, and that’s why I’m chair of DORA, and I’m sure that everybody in this room also cares. But it is going to take energy, it is going to take time, and it is going to take reaching out and building understanding. And I hope that is something we can all do together as we go forward. Thank you very much.
[01:04:42.33] [APPLAUSE] [01:04:51.47] OK, I haven’t overrun, but I haven’t finished quite as soon as I hoped to. I think we have six or seven minutes left, so I am very happy to take questions or criticisms from the floor. If you put your hand up and wave, there’s a roving microphone. Is that a hand right at the back there?
[01:05:12.12] AUDIENCE: Hi. Hello. It’s [INAUDIBLE] here from the international bunch. I’m hoping I’m not going to cough, so bear with me if I do, but I was wondering what level of interest and commitment you’ve had from institutions in China, as there are examples where they expect their researchers to publish in journals with specific impact factors.
[01:05:34.74] STEPHEN CURRY: Yes, we do have interest from China, and one of the members of our advisory board is a Chinese scholar and a member of the Chinese Academy of Sciences who, I think, helps direct an institute looking at research assessment. Yes, there is this issue in China, and in some other countries, where there are very substantial material rewards for publishing in the right journals.
[01:05:56.19] And the Chinese get criticized for this, but I would suggest that the Chinese, in a way, are just being more upfront about something that we do anyway. There may not be a direct reward here– although I think, probably, some institutions in this country will offer people bonuses for getting papers in the right journals. We know that’s how you get promoted, and we know that’s probably how you get grant funding in the future. So ours is a little bit more indirect, and they are being a little bit more direct about it. It is a problem, but it’s part of a whole problem. The Chinese are perhaps playing the game a little bit more explicitly, but it wasn’t a game that they invented.
[01:06:33.15] And that’s why we do need global cooperation to pull away from that, because it creates perverse incentives that drive people to do what they can to get into Nature– which, in some respects, is about having a streamlined story. And therefore there is an enormous temptation to omit data, or to finesse the story in a way that is more friendly to the high-impact journals, rather than being upfront about it.
[01:07:04.02] And I think what is needed is evaluation that looks at rigor, and I think there should be more severe penalties if you get caught out and your work isn’t reproducible– we need to be thinking about how we do that. The lights are really– do I see any more hands? Chap in the blue shirt at the front here.
[01:07:31.92] AUDIENCE: Thanks.
[01:07:34.49] STEPHEN CURRY: Oh, sorry, was there somebody–
[01:07:37.08] AUDIENCE: I would just ask you to elaborate a little bit on the connection between open access and reliability of science, because you were talking about the kind of rapid accessibility, but then on the other hand, a kind of reliability of science, too. And then of course, if you think about your Zika example, we want to have what we know out as fast as possible, but we don’t want to have false research results.
[01:08:03.33] So there’s a certain tension between actually having reliable research results and having those reliable results rapidly– and open access. So what do you think kind of about the relationship between open access and reliability?
[01:08:21.38] STEPHEN CURRY: It’s an interesting relationship, but I’m not deeply troubled by it. I’m very much in favor, as I think I said, of people using preprints, because I think most reasonable scientists are doing their best. And certainly I take great care– actually, once I started using preprints, I took even greater care with my manuscripts before posting them.
[01:08:40.52] Because previously, if you just submitted to the journal, I knew that if there were a few silly errors in there, it would stay between me and the referees and the editor, and nobody else would ever find out. But when you publish a preprint, it’s out there, and if there are mistakes in there, it’s on you. And when it’s published as a preprint, it says at the very top: this is a preprint, it’s not peer reviewed. Use at your own risk, as it were.
[01:09:02.21] Well, it doesn’t say that, but that’s the message. By opening yourself up to a worldwide audience for interrogation– they can come in and tear that piece of work apart, and you’ve made it easy for them to do that– you get a system that I think is better at self-correcting, which is of course one of the things that we tell the public science does in spades. We self-correct all the time, and yet we still seem to publish a lot of unreliable material.
[01:09:31.52] And there is data suggesting that studies published in top-tier journals are often not as statistically well-powered as they should be. If you look at more disciplinary journals, there tends to be a greater emphasis on rigor, but it’s novelty that gets you into those top-tier journals. And sometimes you get novelty because you have an outlier result– because you simply didn’t have enough samples or enough subjects in your study. And eventually, when it’s looked into in greater detail, you get regression to the mean.
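A hypothetical simulation makes this concrete: if only “significant” results from small studies get published, the published estimates systematically overstate the true effect, and replications regress back toward it. The sketch below assumes a one-sample t-test and a p < 0.05 publication filter; all numbers are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_effect, n = 0.2, 20            # a modest real effect, a small sample

published = []
for _ in range(5000):
    sample = rng.normal(true_effect, 1.0, n)
    t, p = stats.ttest_1samp(sample, 0.0)
    if p < 0.05 and t > 0:          # only "significant" positive results survive
        published.append(sample.mean())

print(f"true effect:             {true_effect}")
print(f"mean published estimate: {np.mean(published):.2f}")  # inflated well above 0.2
```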
[01:10:02.30] So there are problems, I think, certainly with the APC model. I don’t regard it as vanity publishing, because pretty much all scholarly publishing is vanity publishing, let’s be honest. But if there isn’t a good Chinese wall, shall we say, between the editorial process and the payment process– and I believe there is in the very best publishers– then there’s a potential conflict there.
[01:10:33.00] But I think openness and transparency help– and open peer review is part of that as well. There’s a very interesting journal called Atmospheric Chemistry and Physics, an open-access journal which started about 10 years ago, and it’s now one of the most prestigious journals in the field. And that’s because they have an open, fully transparent peer review process– it’s a bit like [INAUDIBLE] research.
[01:10:56.87] The manuscript is published, anybody can comment, it’s all open, and when referees are called in, their reviews are published too. They accept above 80% of the manuscripts that are submitted, and yet they have a very respectable impact factor for the field, if one is comparing with other journals. And the editor suggests that because they’re so open, the authors self-select the papers that they submit, because they know they’re going to be scrutinized to the nth degree, and that they won’t necessarily be able to just fool one or two referees. So I think openness and integrity do synergize, if the system is set up properly. Question here?
[01:11:38.42] AUDIENCE: Hi, I’m Richard [INAUDIBLE] from Sage. You advocate a lot for preprints, but I’m not sure every researcher has got enough time to read every single article and analyze it in the way that you describe. So do you not see there’s a place for curation and selection in journals?
[01:11:52.79] STEPHEN CURRY: Oh yes, absolutely. My utopia would be everything as a preprint, and then the journals come along and pick out what they think is interesting. Or you get disciplinary communities publishing overlay journals that maybe synthesize two or three papers– oh look, there’s an emerging finding in this field that looks really significant, really exciting; this is what they’ve done.
[01:12:17.75] So I would encourage societies and other publishers to think about running journals on that kind of model. I don’t know the mechanics of exactly how you would make it work, but I do know that some journals, of course, now scan bioRxiv in the morning, and then write to the author and say, we’d love to receive your manuscript. And the author is usually delighted to have the invitation, because that’s not often how it works.
[01:12:42.53] So yes, I think there is an important role for curation– and a very important role for learned society publishers, because they’re often looking to promote their own discipline and to make sure that the interesting work coming out of it is disseminated to those audiences. And I think that’s one of the reasons the open access mega journals– which I haven’t really talked about in any detail– were a very interesting and important innovation.
[01:13:09.84] But I think one of the things that they have struggled with a bit is actually having a disciplinary relationship with their authors. And I know some of them have been experimenting with trying to create that post facto. So, one hand here, and then there’s one there. Katrina– wait for the microphone.
[01:13:33.58] AUDIENCE: You’re quite rare, Stephen, in that you’re a researcher championing open access and open science. To what extent do you think more researchers can take on this mantle? Because your argument is very powerful and persuasive– certainly to me, and I know I’m biased– but it seems that most researchers are trapped within the system, and are actually complaining, for example, about Plan S, in terms of losing their academic freedom. How do you respond to your colleagues, both in the sciences and in the humanities, about this?
[01:14:14.91] STEPHEN CURRY: Well, in the same way that I wouldn’t expect every researcher to be involved in public engagement, for example– I think there’s great value in public engagement, and it’s something that should be recognized and rewarded within universities and elsewhere. I had a particular passion for it, which grew out of public engagement– I’m here because I started writing a blog in 2008; that’s basically the fundamental reason. To me, that seemed a relatively easy way to discharge my duty to do some public engagement work.
[01:14:42.81] But I found that it really started to make me think about science in a way that I hadn’t done very seriously before, when I was busy working very hard to forge my career and get up the greasy pole to Professor. I do see this work now– I mean, it’s a little bit extracurricular, but I do put it down. I’m not hiding it from my university or from my line manager. I think it’s part of my academic citizenship. That’s how I see what it is that I’m doing.
[01:15:13.35] There are plenty of other working academics in this space. Mike Eisen has just taken over as editor of eLife, and Mike is a very passionate open access advocate– far more passionate than I am, so you should invite him next year, actually. He’d be great value for money. So you don’t need everybody to do it.
[01:15:36.72] And it is a conversation that not everybody is going to be interested in. There’s a lot of people– they just want to get on and do the work. The science is the be-all and end-all of their lives. But there’s a rich range of people with different levels of interest.
[01:15:52.11] And I would like to get more people interested. I still come across people at Imperial who have never heard of Plan S and don’t know what’s happening, and the first time they’ll hear of it is when someone tells them, I’m sorry, but you can’t submit to that journal because it’s not allowed– assuming that that’s how things pan out in the next 12 months or so. Question here?
[01:16:12.35] AUDIENCE: Ed Gerstner, Springer Nature. Do we need means of assessment that encourage diversity? It seems to me that if you want to increase the representation of privileged white men, you ask them to write an essay about themselves. So that particular suggestion concerns me on that front. Do we need specific approaches to assessment that drive diversity?
[01:16:39.78] STEPHEN CURRY: I think one needs to be aware of the potential bias that’s in there. So we know that women are held to a higher standard. We know that fewer women academics, for example, are asked to review, and there are statistics looking at authorships. At the minute, I think first authorships more or less track the demographics of the research population, but senior authorships are still some way behind, so there are issues there.
[01:17:05.89] I think one of the things that would be useful is for publishers to publish the stats on the genders of their authors and of their reviewers, and their success in recruiting them. And I know that some journals have done that. And again, one of the things that you have to do to encourage gender equality in particular is to go out and be more proactive in encouraging women researchers to take on these tasks.
[01:17:32.88] Nature, for example, has done a study of the commentary pieces that are published in Nature– there’s still a big deficit there. And I know they’ve tried really hard, and the numbers are changing in the right direction, but I think they’re still a little bit disappointed in how far they’ve got.
[01:17:48.03] With ethnicity and race it’s much harder, because you don’t have the information. You can usually figure out gender from the name, but you can’t do that easily with ethnicity. And so I’m a bit wary of introducing yet another whole tier of it. Certainly in the UK, we engage with lots of benchmarking schemes that are trying to drive equality, diversity, and inclusion: Athena SWAN, the Race Equality Charter, the Stonewall Workplace Equality Index, the Disability Confident employer scheme, and things like that.
[01:18:23.13] And the monitoring is important– again, it’s an important source of quantitative information. The problem that we have with a lot of those schemes is that the scheme itself– a bit like the REF– spirals out of control and becomes, to a degree, an end in itself. And we forget that the proper end is a research culture that is truly inclusive– that treats everybody the same, that looks for and supports what everybody can contribute, and that recognizes that if you’ve got a different background to me, whatever it is, you’ve got a different take on things.
[01:18:55.44] And we know from business– and, I think, increasingly from research culture– that if you encourage diversity you will win, because you will have better ideas, better discussions, and you’ll be doing research that is of interest to more people, because you’re bringing in lots of different perspectives. Margaret Heffernan’s book A Bigger Prize is really good on these sorts of issues.
[01:19:19.48] I’m pretty sure I’m over time now, so I see Sarah–
[01:19:22.96] AUDIENCE: You can take one more question.
[01:19:24.50] STEPHEN CURRY: OK. Where is it? Oh, yes, I see. Yes, hello.
[01:19:37.78] AUDIENCE: Mark Carden from Mosaic. I work in recruitment, and there’s a lot of movement there to remove institutional wrappers from people– you judge the person, not where they did their undergraduate degree, or even the name of the company they work at at the moment– which makes it quite hard.
[01:19:52.84] You talk about removing the branding of the journal as a sort of criterion, but you’ve not said a lot about institutional branding. Should I discount, I don’t know, an article written by a professor at Imperial College compared to an article written by a more junior person at a less prestigious institution– is the article the important thing, not the container?
[01:20:16.24] STEPHEN CURRY: Yes, absolutely you should, because you should judge the work on its own merits. Not everybody at Imperial is a genius– let me confirm that for you right now. And yes, I think there is work to show that top-tier universities tend to recruit from each other.
[01:20:36.95] And again, that is the result of a system of evaluation that looks at these brands and signatures of prestige– some of which are journals, some of which are university names. And we’re going to have another ranking published later today which does just that. But we know, from actually looking at the demographics of our students and our staff, that there are people with privileges who get into these places more easily. They’ve got the capital, they’ve got the experience, they’ve got the funds behind them economically– all of which makes their path easier than it is for the minorities that are clearly underrepresented in the academy.
[01:21:15.28] And that’s not just women. Clearly there’s still a huge problem with gender equality, and with the assumptions that people make about the capabilities, the ability, or the ambition of women to forge a career– because if they want to have a family, questions are asked: oh, well, are they really devoted to an academic career? These are the sorts of institutional barriers that still exist for them.
[01:21:39.68] And so we really do have to get to a point where we look at what people have done and judge them on their contributions, rather than reaching for the shorthand– and the shorthands are very powerful, because they make life easy, and we’re all– God help us– ferociously busy, so we’re always looking for shortcuts.
[01:22:00.85] And so I think one of the solutions– I don’t know how you operationalize it fully– is anonymization as much as possible: remove journal names from the CV publication list, remove institutional names from where you got your degree, and really judge people on themselves. Because there’s a huge range of performance, even at top-tier universities, and you will get brilliant people at universities that you may never have heard of. It happens.
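As a toy illustration of the anonymization idea– with made-up journal and institution lists, and none of the curated data or fuzzy matching a real system would need– a CV line might be redacted like this:

```python
import re

# Made-up lists; a real system would need curated name lists and
# fuzzier matching than exact string substitution.
JOURNALS = ["Nature", "Cell", "The Lancet"]
INSTITUTIONS = ["Imperial College London", "Harvard University"]

def anonymize(cv_line: str) -> str:
    """Redact journal and institution names from one CV entry."""
    for name in JOURNALS + INSTITUTIONS:
        cv_line = re.sub(re.escape(name), "[REDACTED]", cv_line)
    return cv_line

print(anonymize("Smith J et al., Nature (2018); Imperial College London"))
# -> "Smith J et al., [REDACTED] (2018); [REDACTED]"
```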
[01:22:28.39] And you can’t just use the law of averages to do these things. You have to look at people and treat each individual on their own merits.
[01:22:45.20] SARAH FAULDER: So I don’t see Wayne in the room, so I think it falls to me to ask you all to thank Stephen for a really fascinating talk and for fielding all the questions.
[01:22:56.25] [APPLAUSE] [01:23:01.22] And I know there are lots of questions still out there. We do have a networking break now, so hopefully Stephen will make himself available to continue the discussions.
[01:23:11.27] STEPHEN CURRY: I’m around for the rest of the afternoon, yes.
[01:23:12.71] SARAH FAULDER: I think we reconvene at 3:15. So thank you all very much.