
Experts, Disagreement, and Trust

When politics invites itself into your research

Prof. Maria Baghramian
Member of the Royal Irish Academy, School of Philosophy, UCD

The research project When Experts Disagree, funded by the Irish Research Council, was an attempt to resolve the complexities of peer expert disagreement. During its work, the philosophical discourse on expertise changed, with the very standing of experts undermined in an age of political anxiety and suspicion. This article explores the topic of trust in experts and the role of philosophy in defending against the dark clouds of irrationality and extremism.


Disagreement among individuals or social groups, a common feature of our daily lives, is troublesome not just because of its impact on our personal and sociopolitical relationships but also for the philosophical dilemmas it creates. One longstanding challenge, discussed by philosophers since the Sophist Protagoras (c.490–c.420 BC), is how to understand and deal with persistent disagreements, particularly in the normative domains of ethics, aesthetics, and matters of taste, which, despite centuries of debate, do not seem amenable to rational resolution. The ancient but enduring philosophical doctrines of relativism and scepticism are among the best-known attempts to resolve or dissolve this so-called problem of intractable disagreement.1

Other philosophical difficulties arise when dealing with disagreements between experts. One puzzle is to explain how two or more experts could share similar levels of training and experience, be (to all appearances) equally rational, honest, intelligent, and thoughtful, have access to the same data and evidence, and yet arrive at contradictory conclusions. The two corollaries of this puzzle are: How should the beneficiaries of the expertise – policymakers, governments, ordinary citizens – choose between the conflicting expert opinions and advice? And what stance should an expert take when facing a peer who rejects their view? Should they entertain doubts and reduce the level of credence they have been placing on a set of evidence or point of view, or should they remain steadfast and try to show that the other party is wrong?

The question, as the psychologist Robert Hoffman puts it simply but effectively, is: ‘If the “experts” are experts, why do they disagree? And since they do disagree, how can we [and even they themselves, I would add] rely on their judgments and advice?’2

New perspectives for a national project

The research project When Experts Disagree (WEXD) (2015–2017), funded by the Irish Research Council’s New Horizons scheme, was an attempt to come to terms with the complexities of peer expert disagreement. Co-directed by astrophysicist Luke Drury (Dublin Institute for Advanced Studies), the project compared cases of expert scientific disagreement in astrophysics, a field relatively free of economic and political pressures, with cases in the politically charged area of climate change. The comparison, we argued, could lay bare some of the essential methodological and normative differences in the treatment of disagreement in the two arenas.3

Within a few months of the launch of WEXD, the political world imposed a new perspective on our enquiry. In April 2016, Donald Trump declared his official hostility to experts: ‘They say, “Oh, Trump doesn’t have experts,”’ he told a crowd of his avid supporters. ‘You know, I’ve always wanted to say this. … The experts are terrible.’4 Michael Gove followed suit: ‘I think the people in this country have had enough of experts from organisations with acronyms saying that they know what is best and getting it consistently wrong. … Because these people are the same ones who got consistently wrong what was happening.’5 The concern occasioned by these remarks was not so much about a few politicians posturing just before a crucial vote as the sense that they were tapping into a new sociopolitical zeitgeist, a concern further confirmed by the Brexit and US presidential election results.

Researchers are increasingly asked to make their work ‘relevant’, to prove its value for industry, society, or politics. This demand is not always easily met by philosophers working on epistemological and metaphysical questions, or indeed by astrophysicists working on cosmic rays emanating from the deep recesses of the universe. But the traffic between research on abstract topics and research relevant to the messier ‘real world’ is not always unidirectional. The real world can invite itself, if not indeed force its way, into even the most abstract research agendas and thus open new and interesting horizons of investigation.

The recent development and popularity of ‘applied epistemology’ is one such example. A significant number of established philosophers, specialists in traditional epistemology, metaphysics, and philosophy of language, have turned their attention to politically pressing issues of the day: the discourse of post-truth, conspiracy theories, and the politics of knowledge.6 What binds us all is a common feeling that extraordinary times require out-of-the-ordinary intellectual responses, and that philosophy can have a role in constructing some defence lines against the threatening dark clouds of irrationality, extremism, arrogance, and bigotry.

An international consortium

With the new political climate, it became obvious that the philosophical discourse on expertise was also changing. The urgent challenges were no longer about the dilemma of the disagreeing experts but the more troubling question of the very standing of experts and expertise in an age of political anxiety and suspicion. The experts, this vanguard of the ‘intellectual elites’, were deemed as suspect as other reviled elites: bankers, politicians, and the media. The politically motivated question now was why, and whether, we should trust the experts, rely on their advice, or allow their presence in public or private decision-making.

As a result, the final academic event of WEXD was an international conference on Trust in Experts, an event organised in part to signal the new direction that our original research was going to take. By that stage, trust in experts had become the topic of the day, and Luke Drury and I were invited to join a British Academy and All European Academies (ALLEA) working group on Trust, Truth, and Expertise (TTE) co-chaired by Baroness Onora O’Neill, a publicly engaged philosopher who had been writing on trustworthiness for decades.7

The Working Group, in the course of four workshops over 15 months, produced three working papers on issues we deemed fundamental to the topic of trust in experts.8 But it was clear that we were only scratching the surface of the problem and that there was much more to be done. The discussions of the TTE working group, and the results obtained by WEXD, became the springboard for a successful application to the Horizon 2020 European funding scheme for a new project on trust and expertise, entitled PEriTiA (Policy, Expertise, and Trust in Action).

The condition of trust

Technologically advanced societies increasingly rely on knowledge-based or knowledge-driven forms of governance, where policy decisions and legislation are reliant on advice and data from various sources of expertise. Key policy decisions, ranging from food safety to climate change, lifestyle to healthcare, economic planning to education, are guided by data, evidence, and advice from experts in the relevant fields.

In democratic systems of government, where consent by citizens is a requirement of good governance, trust in experts and their advice is a requirement for achieving a workable triangulation between expert opinion, governance, and citizens’ consent. For instance, no amount of sound advice and strong evidence would be enough to implement a mass vaccination policy or a particular piece of dietary advice, on a voluntary basis, without prior judgement of the trustworthiness of the sources and the quality of advice given by the experts and transmitted by the relevant health authorities.

More pressing still, democratic governments will find it difficult, if not impossible, to legislate on policies directly informed or even shaped by expert advice, however well-intentioned, if those policies are distrusted by the people they affect. It is a truism that citizens who trust governmental policies and authorities are more likely to comply with their directives. But trust is a commendable stance only when it is warranted, when the objects of our trust are genuinely trustworthy.

Epistemic vigilance is one way to ensure that the leap of faith involved in trust is not confused with a thoughtless dive into the complete unknown.9 However, the nature and the correct measure of social trust, and the requirements of epistemic vigilance, particularly in the changing landscape of social media, instant communication, and big data, are not well understood.

New directions in the study of trust in experts

PEriTiA conducts in-depth multidisciplinary research on the topic of trust in policy-related advice from scientific experts. By focusing on the type of trust required to create legitimacy for informed, evidence-based policy decisions on complex issues, it aims to shed light on the alleged breakdown of trust in various facets of public governance – what is often called a ‘crisis of trust’. We also aim to clear some of the conceptual confusion on the notion of trust and trustworthiness.

Our chief research hypothesis, to be investigated in various theoretical and empirical ways, is that epistemic public trust, contrary to some established views, should not be confused with mere reliance. There is always an element of risk or a leap of faith involved in trust, and with it comes the feeling of betrayal that broken trust entails. Our study will try to show that trust in people or organisations not only requires epistemic vigilance but also has significant affective and normative dimensions that have a decisive impact on judgements of trustworthiness.

The investigation is carried out in three phases – theoretical, empirical, and ameliorative – and relies on the work of philosophers, behavioural economists, sociologists, policy experts, and media specialists from UCD, ALLEA, University of Oslo in Norway, CNRS in France, University of Utrecht in the Netherlands, King’s College London, University Vita-Salute San Raffaele in Italy, the American University of Armenia, the Polish Academy of Sciences, campaigning charity Sense About Science, and SME Strane Innovation. The project also benefits from the direct involvement of a distinguished advisory board that includes Onora O’Neill, Susan Owens, Cass Sunstein, David Farrell, and Dan Sperber. The results of the research will be published online and in scholarly journals and books over the next three years.


  1. For the connections between disagreement and relativism and scepticism, see Baghramian, M. and Coliva, A. (2019) Relativism: New Problems of Philosophy. London and New York: Routledge.
  2. Hoffman, R. (1996) ‘How can expertise be defined? Implications of research from cognitive psychology’, in: R. Williams, W. Faulkner, and J. Fleck (eds.) Exploring Expertise, pp. 81–100. Edinburgh: University of Edinburgh Press.
  3. Selected publications of the IRC project When Experts Disagree (Project ID: REPRO/2015/89) include:
    1. Baghramian, M. (2019) From Trust to Trustworthiness. London: Taylor and Francis.
  2. Baghramian, M. and Martini, C. (2018, 2019, 2020 forthcoming) Special Issues on Expertise and Expert Knowledge. Social Epistemology, 32(6), 33(2), 34 (forthcoming). Taylor and Francis.
    3. Beebe, J.R., Baghramian, M., Drury, L., and Dellsen, F. (2019) ‘Divergent perspectives on expert disagreement: Preliminary evidence from climate science, climate policy, astrophysics, and public opinion’, Environmental Communication, 13(1), 35–50. doi: 10.1080/17524032.2018.1504099
    4. Dellsén, F. (2017a) ‘When expert disagreement supports the consensus’, Australasian Journal of Philosophy, 96(1), 142–156, doi: 10.1080/00048402.2017.1298636
  5. Dellsén, F. (2017b) ‘The heuristic conception of inference to the best explanation’, Philosophical Studies. doi: 10.1007/s11098-017-0933-2
    6. Dellsén, F. (2018) ‘The epistemic value of expert autonomy’, Philosophy and Phenomenological Research. doi: 10.1111/phpr.12550.
    7. Dellsén, F. and Baghramian, M. (eds.) (2020, forthcoming) Special Issue on Disagreement in Science. Synthese: An International Journal for Epistemology, Methodology and Philosophy of Science.
  4. Gass, N. (2016) ‘Trump: “The Experts Are Terrible”’, Politico, 4 April.
  5. Gove, M. (2016) Sky News, 3 June.
  6. See for instance Cassam, Q. (2019) Conspiracy Theories, London: Polity Press; Lynch, M. (2019) Know-It-All Society: Truth and Arrogance in Political Culture, New York: Liveright Publishing; and Tanesini, A. (2018) ‘Arrogance, anger and debate’, Symposium: Theoretical and Applied Inquiries in Philosophy and Social Sciences, 5(2), 213–227.
  7. It’s a testimony to the popularity of this and cognate topics that many similar invitations were soon to follow, including one by the Pew Centre and The Economist to their global Evidence Initiative.
  8. ALLEA working papers by the Working Group on Trust, Truth, and Expertise. Working Paper 1: Loss of Trust? Loss of Trustworthiness? Truth and Expertise Today. Berlin, Germany: All European Academies, 2018. Working Paper 2: Trust Within Science: Dynamics and Norms of Knowledge Production. Berlin, Germany: All European Academies. Working Paper 3: Trust in Science and Changing Landscapes of Communication. Berlin, Germany: All European Academies, 2019.
  9. Sperber, D., Clément, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., and Wilson, D. (2010) ‘Epistemic vigilance’, Mind & Language, 25(4), 359–393. doi: 10.1111/j.1468-0017.2010.01394.x
