[Tomorrow I am giving a talk in great company, in which I elaborate a bit on the five norms of analytical egalitarianism, which were first developed by the economists Sandra Peart and David Levy (recall also). All the norms are defeasible and require contextual judgment. In this post I discuss the third norm only; see yesterday's post for the first two.--ES]
Recall the first two norms were:
- Experts/philosophers cannot keep themselves (their incentives/their roles, etc.) out of the model/proposal. In practice this means that we can't simply assume that philosophers are disinterested truth-seekers (especially not in the context of policy).
- Experts/philosophers should not promote policies where the down-side risks of implementation are (primarily) shifted onto less fortunate others.
The third norm is:
3. Experts/philosophers should make an effort to educate policy- and opinion-makers about counter-arguments to the policies they advocate.
This norm ought to come easily to philosophers, who adore counter-arguments, and to model-driven economists, who really understand the effects of even subtle changes in one’s estimates. But that's not what happens in practice. Rather, we tend to promote solutions or truth in the public sphere. I am not blind to the fact that the ‘demand’ side – politicians, media, public, etc. – may well want this and select for it (recall incentives above). But just because folk wish to receive The Gospel, it does not follow that we ought to give them a gospel. That is, even if one is a Monist about X, one has a duty to explain the possible limitations of X to those who have to live with/abide by X.
Given that even foreseeable consequences may also have unintended side-effects that are invisible to one kind of expert (but, perhaps, foreseeable to well-informed or interested outsiders), we have a duty to assertively articulate known objections (or confidence intervals, etc.) about the possible consequences of policy. Even if policy-makers may not wish to hear about model-uncertainty, say, this does not exculpate those who provide their ‘clients’ with what they want to hear. I would argue that we have professional duties that, regardless of policy-makers’ wishes, may demand fuller disclosure or remaining silent. Some of us receive a great deal of professional privileges and access to ‘success goods,’ and these come with duties attached.[1]
There are additional reasons to adhere to this norm: even if a field (philosophy or economics) has a consensus, unless the field instantiates perfect communicative rationality, which it never does in the real world, there will be suppressed objections. For, in practice, self-affirming benchmarks may well be woven into a field’s standard practice.
It is well known that alternative models and even long-standing objections can get suppressed from a discipline’s collective tool-kit and memory. In the philosophy of science literature, the suppression of long-standing objections or even reliable alternative approaches is known as a “Kuhn-loss.” In particular, insights of discarded theories that cannot be articulated or recognized by the new theory are instances of Kuhn-losses. This is “suppression” in a non-moral sense; it’s about epistemically important practices that set aside, say, questions, anomalies, or results in pursuit of more epistemically promising alternatives. Kuhn is significant here for a related point. He helped popularize a view of paradigms that allowed social-scientific practitioners to claim that they need not answer all objections; a welcome result in some consensus-aiming policy sciences, but one that can easily turn into a form of morally objectionable suppression of alternative viewpoints.
I have treated this norm as a duty here. But it can also be understood as a learning opportunity. We are often charmed by solutions to problems. When we wish to improve the world, we may allow ourselves to miss important consequences that follow from the implementation of our preferred solution. Forcing oneself to articulate knowable objections may well teach us something. Asking those who may be affected by such implementation is a useful strategy here; interested parties sometimes catch problems where the expert may not.
[1] I have argued for this in Merel Lefevere and Eric Schliesser, "Private Epistemic Virtue, Public Vices: Moral Responsibility in the Policy Sciences," forthcoming in Experts and Consensus in Social Science (2015), edited by Carlo Martini and Marcel Boumans (Dordrecht: Springer). This post draws on our joint work.