Reimagining Trust in Science

During National Science Week 2020, the History and Philosophy of Science (HPS) program hosted two events as part of the University of Melbourne’s Science Festival. The first event was a panel discussion on ‘Reimagining Trust in Science’; the second was an interactive workshop showing how the repliCATS platform is being used to assess the reliability of COVID-19 research. HPS graduate student Samara Greenwood reviews the events.

It is impossible to be a student in History and Philosophy of Science and not be aware of issues of trust in science. These issues range from the ‘replicability crisis’, involving specific concerns about the reliability of published research results, to a general decline of public trust in science over recent years. As my formal studies have only briefly touched on these topics, I was interested to learn more about them through the HPS events held during National Science Week.

Panel Discussion on ‘Reimagining Trust in Science’

In the panel discussion ‘Reimagining Trust in Science’, three leading researchers presented their current work on this topic.

The first panellist, Professor Rachel Ankeny, is an interdisciplinary scholar at the University of Adelaide. Her work examines the relationship between the practice of science and the values of wider society.

Rachel noted that public distrust of science has a strong emotional element. Many people no longer believe they can ‘just trust science’, as scientists do not always appear to work for the benefit of all. This disquiet is compounded when experts respond to such concerns by treating the public as merely uninformed. Through group workshops, Rachel has found that a ‘deliberative democratic model’ is a productive way for scientists and non-scientists to communicate. Instead of a one-way flow of information, scientists and non-scientists engage in a co-operative manner to understand a range of perspectives and, ultimately, ensure the good of the whole.

Slide from Professor Fiona Fidler’s presentation on ‘Reimagining Trust in Science’

The second panellist was Professor Fiona Fidler of the University of Melbourne who also presented at the Science Faculty’s Dean’s Lecture on ‘Trusting Science in a Time of Crisis’. Fiona is lead researcher on the repliCATS project and co-founder of MetaMelb Research Group, both of which focus on improving scientific practice.

Fiona agreed that today’s non-scientific public often feels alienated from science and that known problems, such as publication bias, feed into this unease. She also noted that many scientists urge refraining from public critique of science, arguing that it can provide ammunition to those with a vested interest in undermining it.

The question then is: what is the best way forward? Fiona’s response is to highlight the work taking place to fix these issues. As Fiona noted, science has resolved similar issues in the past, but such corrections didn’t occur by magic. As she succinctly put it, “self-correction requires investment”. Keeping quiet about uncertain practices is not the answer. Rather, science needs to clearly demonstrate the way in which it is restructuring itself to resolve these issues.

The final panellist was Professor Simine Vazire from the University of Melbourne. Simine is a professor of psychology and co-founder of MetaMelb Research Group; her research interests include assessing the quality of scientific studies and the peer review process.

In her presentation, Simine made the important point that society depends on science to provide reliable knowledge, thus science has a responsibility to ‘live up’ to that dependence. She identified two crucial components of credibility: transparency and critical appraisal. Transparency helps establish credibility by letting others ‘look under the hood’ of science, with the Open Science movement a clear example of this way of working.

However, Simine noted that transparency is not enough. Open data must also be subject to critical appraisal in order to identify errors that may initially be overlooked. The testing, critiquing and correction of scientific practice then helps ensure the long-term credibility of scientific knowledge.

All three presentations generated lively questions and comments from the audience. It was clear these issues struck a chord, and the presentations were valuable in not only pointing out problems but providing a range of possible solutions.

repliCATS Workshop: Assessing the Reliability of COVID-19 Research

In this repliCATS workshop, the challenges around trust in science were addressed in a very practical, hands-on way. During the workshop, Fiona Fidler and fellow repliCATS researcher Dr Martin Bush outlined the background to their work before taking us through the process of assessing the replicability of a recent research paper on COVID-19.

The repliCATS project utilises the IDEA protocol, a method developed at the University of Melbourne. In this protocol, a diverse group of volunteers assesses the likelihood that a particular research result is replicable (i.e., that the same result would be achieved if the same research process were repeated by others).

Using the repliCATS platform, an assessor first reviews a specific piece of research and provides an initial prediction of its replicability. The assessor then discusses their reasoning with others, which allows for both personal reflection and exposure to other viewpoints. In light of this discussion, each assessor provides a second prediction. In this way, the final outcome benefits from a diversity of assessors and a broad range of judgements, as well as from both group deliberation and individual judgement.

Slide depicting the repliCATS IDEA protocol (Investigate, Discuss, Estimate, Aggregate)
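To make the two-round process a little more concrete, here is a minimal Python sketch of an IDEA-style elicitation. The assessor names, the probabilities and the simple averaging step are illustrative assumptions only; they are not the repliCATS platform’s actual data, interface or aggregation method.

from statistics import mean

# Round 1: each assessor investigates the paper and gives a private
# probability that the result would replicate (Investigate + Estimate).
round_one = {"assessor_a": 0.40, "assessor_b": 0.70, "assessor_c": 0.55}

# Round 2: after group discussion, each assessor may revise their
# estimate (Discuss + Estimate).
round_two = {"assessor_a": 0.50, "assessor_b": 0.65, "assessor_c": 0.60}

# Combine the post-discussion estimates into a single group judgement
# (Aggregate); a plain average is used here purely for illustration.
group_estimate = mean(round_two.values())

print(f"Group estimate that the result replicates: {group_estimate:.2f}")

Even in this toy version, the value of the second round is visible: estimates move towards one another after discussion while still reflecting each assessor’s individual judgement.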

One workshop attendee raised an interesting question: who are the best predictors? While the repliCATS research is not yet complete, the team has found that expertise in a narrow research area does not necessarily correlate with accurate predictions. Rather, what seems most useful is a broad familiarity with the general methods and models used across different forms of research.

All up, I found the two events worked well together, providing different perspectives on similar concerns. I came away feeling better informed, and encouraged by the dedicated work underway to ensure these critical issues are investigated and addressed.