After closing the call for comments “which responsibility”, I have received further comments and proposals to contribute, reflecting on the concept of responsibility in the complex context of our contemporary society.
It is with pleasure that I publish here this piece by Mariachiara Tallacchini, hoping that it will invite more comments and contributions from all those who needed more time to reflect than the deadline of the call allowed.
Any such comments, or proposals to contribute to the blog along these lines, will be very welcome.
Cristina Grasseni
From Causality to Responsibility: Steps towards the Unknown
Mariachiara Tallacchini
With better involvement comes greater responsibility. Civil society must itself follow the principles of good governance, which include accountability and openness.
(European Governance: A White Paper, 2001)
Contemporary scientific knowledge is increasingly characterized by uncertainty (O’Riordan and Cameron, 1994). This is not only because the risks and the unpredictability linked to it are growing, but above all because of the intrinsic incompleteness and indeterminacy of scientific knowledge relative to the needs of social choice, public policy, and legal decision-making. The expression scientific uncertainty has been used to refer to different forms of lack of information in science: the complexity of knowledge, the lack of data, the unpredictability of results, and the stochastic character of predictions. This means that, more and more often, the experts involved in regulatory science are unable to adopt an unequivocal position. At the same time, the objectivity and certainty of scientific methods – and the possibility of keeping them free from values and subjective opinions – have long been explored by legal scholars and political scientists as ways both to shape legal systems according to the rules of logic and to found the social contract on scientific bases. From the legal point of view, science is still considered both the ultimate methodological referent and a separate entity within society. Hence, any parallel between the scientific system and the legal system can be seen only as a remote exchange between forms of knowledge with substantially incommensurable and non-communicating methodologies and goals.
By and large, this approach has also influenced the legal regulation of scientific activities and products. Since science is considered an independent social institution which uses objective criteria to determine which knowledge may be deemed valid in a given situation, the law that interacts with science in order to regulate it is conceived of essentially as a technical norm, bound uncritically to ratify knowledge ascertained and evaluated elsewhere.
Beginning in the eighties, uncertainty in science has been widely explored, after the philosopher of science Ian Hacking remarked that the centrality of ignorance in contemporary science had not received enough attention as to its epistemological status (Hacking, 1986). According to Smith and Wynne, lack of knowledge may lead to different situations: risk, uncertainty, ignorance and indeterminacy (Smith and Wynne, 1989). In decisions under conditions of risk, the main variables of a problem are known and the respective probabilities of the different outcomes are quantified. In contrast, in decisions under conditions of uncertainty, even if we know the main variables of a system, we do not know the quantitative incidence of the relevant factors, and so we do not know the probability of an event. A different definition qualifies uncertainty as “a probability of the second order” (Bodansky, 1994). This means that, while in cases of risk we can quantify the probability of the event, in cases of uncertainty we can only quantify the probabilities relating to alternative risk assessments. Ignorance is the situation of “unknown unknowns” (European Environment Agency, 2001): since the basic elements of a problem are unknown, the possible negative outcomes are also unknown, and they remain unpredictable unless new cognitive elements emerge. Finally, indeterminacy is the concept that summarizes the basically open and conditional character of all knowledge, particularly its contextual meaning and its socio-cultural determination. Scientific uncertainty thus seems to challenge the reliability of the decision-making process.

The last few years have seen the radical subversion of the conditions that made the theoretically neutral and separate relationship between science and law tenable. Scientific activities and products subjected to the scrutiny of law have increased exponentially, and contexts have appeared in which science has at once created risks and proved largely incapable of controlling them (Raffensperger and Tickner, 1999). The technoscientific component increasingly constitutes the cognitive content of norms, yet the number of situations is growing in which law has to fill cognitive gaps, since scientific data prove uncertain, insufficient, or susceptible to sharply diverging interpretations.
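Bodansky’s “probability of the second order” can be made concrete with a minimal numerical illustration (the figures here are invented for the example and appear in none of the cited sources). Under risk, the probability of the harmful event is itself known:

\[
P(\text{harm}) = p, \qquad p \text{ known}.
\]

Under uncertainty, one can at best attach credences to competing risk assessments \(A_1, \dots, A_n\), each asserting a different value of \(p\):

\[
P(\text{harm} \mid A_i) = p_i, \quad P(A_i) = q_i; \qquad \text{e.g. } p_1 = 0.01,\ p_2 = 0.20,\ q_1 = 0.7,\ q_2 = 0.3.
\]

What is quantified is then no longer the event itself, but the plausibility of alternative estimates of its probability.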
On the one hand, the strong presence of scientific knowledge in matters subject to regulation means that the relationships between science and law must be explored as an intersection of scientific and legal concepts and qualifications. On the other, the indeterminate or uncertain character of much scientific knowledge poses the problem of selecting specific norms to overcome the gaps left by science.
The problem of the legal treatment of uncertainty is at the root of the precautionary principle, which can be understood – and has been described – as a principle of responsibility. The precautionary principle (PP) was introduced internationally in 1992 – as a precautionary approach – by Principle 15 of the Rio Declaration on Environment and Development, which states that “(w)here there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation”.
The Maastricht Treaty (Art. 130R, par. 2; Art. 174 of the Amsterdam EC Treaty; now European Convention, Sec. 5, Environment, Article III-129) presented the PP for the first time as distinct and autonomous from the principle of prevention. Some overlaps exist between precaution and prevention. A preventative element is certainly present in the PP, even if what is to be prevented is a damage that is only potentially hypothesized. It is more correct to speak of an anticipatory aspect: that is, the anticipation of the (political) judgment that signs of causality are present, in the absence of ascertained causal links.
The most interesting interpretation of precaution has developed along this line: the awareness that the law must intervene “even before a causal link has been established”, where anticipation does not hint at a general preventive intervention, but at the critical awareness that causal and scientific evidence may be achieved too late or may be unattainable (Bodansky, 1994). Thus, law and science appear to complement each other in decision-making under conditions of uncertainty.
With the PP, law frees itself from submission to science and works out a critical position that acknowledges a positive role for ignorance. As Bodansky has outlined, “Risk assessment, unlike the precautionary principle, generally assumes that we can quantify and compare risks. It is information intensive and rational. Moreover, it can and often does take a neutral attitude towards uncertainty. (...) In contrast, the precautionary principle is not neutral towards uncertainty – it is biased in favor of safety” (Bodansky, 1994: 209). The passage from a two-value science to a three-value science is thus accomplished: from an idea of scientific quality that confines itself to evaluating the truth or falsity (verification or non-verification) of a scientific hypothesis, to a science that expressly considers and recognizes the hypothesis of uncertainty and indecision. The need for this three-value science, as Shrader-Frechette has pointed out, depends on an essential difference between theoretical science and science applied to risks: while the former moves in the abstract perspective of true/false, the latter is bound up with the real and complex question of the acceptability or unacceptability of risk (Shrader-Frechette, 1996).

In risk analysis, two different kinds of error may occur in decisions under uncertainty: type-I errors occur when one rejects a true null hypothesis (a claim of no effect); type-II errors occur when one fails to reject a false null hypothesis. In assessing environmental impacts under uncertainty, where both types of error cannot be avoided at once, minimizing type-I error means minimizing the error of rejecting a harmless development, while minimizing type-II error means minimizing the error of accepting a harmful development. The former reflects an excessive scientific optimism, the latter an excessive prudence. The prospect inherent in the precautionary principle tends to reduce as much as possible the mistakes that produce risks for people, considering it better to make a mistake harmful to the economy – by limiting a development that is not in itself risky – than one harmful to people.
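In standard statistical notation – a schematic restatement of this trade-off, not drawn from Shrader-Frechette’s text – let the null hypothesis \(H_0\) be the claim of no harmful effect. Then

\[
\alpha = P(\text{reject } H_0 \mid H_0 \text{ true}), \qquad
\beta = P(\text{do not reject } H_0 \mid H_0 \text{ false}).
\]

A type-I error (probability \(\alpha\)) blocks a harmless development; a type-II error (probability \(\beta\)) lets a harmful one through. For a fixed body of evidence the two rates trade off against each other: lowering \(\beta\) generally raises \(\alpha\). The precautionary stance described above amounts to accepting a larger \(\alpha\) in exchange for a smaller \(\beta\).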
In 2000, the Communication of the European Commission on the PP qualified it as a general principle of the European Union for human, animal, plant, and environmental health (Commission of the European Communities, 2000). The PP – the Commission says – must be considered within a unitary process of risk analysis (assessment, management, and communication) and may be used when scientific information is inadequate, inconclusive, or uncertain. Once invoked, the PP may be applied by adopting different measures of information and protection, as well as by deciding not to adopt any particular measure. But what the Commission makes very clear is that the PP is a principle of responsibility, namely the principle that considers certain risks as “inconsistent with the high level of protection chosen for the Community”, and their management as “an eminently political responsibility”.
The PP has been the object of strong criticism from the scientific world, which judges it a kind of obscurantism and an instrumental support for people’s irrational fears. The philosophical and moral reflection at the root of its theoretical foundation, and which has had a great impact on this interpretation, is Hans Jonas’ perspective of the heuristics of fear (Jonas, 1985): according to Jonas, when confronted with scientific uncertainty, and in order to protect what is possibly at stake and what we must beware of, it is wiser and more responsible to give priority to the prophecy of doom over the predictions of hope.
It is interesting to observe that Jonas provides the PP with a psychological foundation – the feeling of fear – rather than an epistemic one. In Jonas’ philosophical vision there is no room for a cognitive dimension outside the objectivity and certainty of science. Lack of full knowledge therefore lacks an epistemic status of its own, and ignorance is a psychological rather than a cognitive position. Accordingly, fear appears as a substitute for cognition in the face of the unknown, and as an adequate mechanism for prudent behavior. But uncertainty is not just a synonym for nonrationality or irrationality. Following Hacking, we should instead reflect on the status of lack of knowledge in its cognitive aspect and determine our actions accordingly. This means a behavior of active scientific wisdom, combined with awareness of the value-laden dimensions of science, and strengthened by the use of procedures aimed at making choices more legitimate, objective and shared.
But this position does not reflect the reality of the PP. Although the PP is considered the most characteristic feature of an emerging European epistemological identity in science policy (Tallacchini, 2002), it is hard to see it as an innovative principle in the political decision-making process.
Even though some legally binding European documents, such as Directive 2001/18/EC on the deliberate release of Genetically Modified Organisms (GMOs) or Directive 2004/40/EC of 29 April 2004 on the minimum health and safety requirements regarding the exposure of workers to the risks arising from electromagnetic fields, make consultation with the public mandatory, these procedures do not unequivocally reflect a more democratic attitude towards science-based policy, but may be aimed mainly at obtaining consensus. In fact, according to the Communication on the PP, the principle can be institutionally invoked only by the European Commission, and no legal power over it is granted to citizens.
It is important to observe here that the positivist view of science – denying the existence of uncertainty – and the psychological foundation of the PP (Jonas, 1985) – denying the cognitive side of ignorance – are alike in that both easily lead to authoritarian political results. The former is associated with a technocratic perspective in which the scientific community informs the content of legal and political decisions. The latter, even when linked to public consultation, can likewise have authoritarian results, since it relies exclusively on a political will divorced from any cognitive rationale and supported by public fear. Both perspectives, in fact, share the certainty-or-irrationality alternative: the model according to which, outside scientific certainty, only opinions or merely felt preferences exist.
But theoretical reflection on the relations among science, policy and the law has gone beyond this alternative, which is in fact an absence of alternatives.
Jerry Ravetz (Ravetz, 1999) and Silvio Funtowicz (Funtowicz, 2001), referring to the normative challenges posed by the life sciences, have coined the expression post-normal science to indicate situations where “typically facts are uncertain, values in dispute, stakes high, and decisions urgent”.
But the present situation concerning the social impact of technoscience almost always fits this description: post-normal science is, in other words, the normal situation in most science-based social choices. From this point of view, the PP as defined by Principle 15 of the Rio Declaration on Environment and Development seems conceptually superseded. Principle 15 speaks of a lack of full scientific certainty, thus implicitly assuming that the normal condition of science is certainty, and that uncertainty is always circumstantial and temporary. It presupposes, once again, an incremental model of science in which the truth is sooner or later reached. According to Jean-Pierre Dupuy, the PP misses the very notion of uncertainty: “The key notion here is that of informational incompressibility, which is a form of essential unpredictability. In keeping with von Neumann’s intuitions on complexity, a complex process is defined today as one for which the simplest model is the process itself. The only way to determine the future of the system is to run it: there are no shortcuts. This is a radical uncertainty” (Dupuy, 2004: 80).
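Dupuy’s “no shortcuts” claim can be illustrated with a toy computational sketch (an illustration added here, not drawn from Dupuy’s text; the choice of rule, grid size and seed is arbitrary). For a “complex” elementary cellular automaton such as Rule 110, no generally shorter route to the state at step T is known than performing all T updates:

```python
# Toy illustration of informational incompressibility: for an elementary
# cellular automaton such as Rule 110, no generally shorter description of
# the state at step T is known than actually iterating the rule T times.

RULE = 110  # rule number encoding the 8 possible neighborhood updates

def step(cells):
    """One synchronous update: each cell's new value is the bit of RULE
    selected by its 3-bit neighborhood (left, center, right)."""
    n = len(cells)
    return [
        (RULE >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# "The only way to determine the future of the system is to run it":
state = [0] * 40
state[20] = 1                      # a single live cell as initial condition
for _ in range(20):
    print("".join("#" if c else "." for c in state))
    state = step(state)            # no shortcut: simulate step by step
```

On this toy grid the pattern could of course be tabulated in advance; the point is the general one: for rules of this kind no closed-form expression for the state at step T is known, so prediction and simulation coincide.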
In Dupuy’s view, the introduction of subjective probabilities into statistics has allowed the reduction of uncertainty to the concept of quantifiable risk, because subjective probabilities no longer correspond to any sort of regularity found in nature, but simply to the coherence displayed by a given agent’s choices.
“A risk can in principle be quantified in terms of objective probabilities based on observable frequencies; when such quantification is not possible, one enters the realm of uncertainty. It is easy to see that the introduction of subjective probabilities erases the distinction between uncertainty and risk, between risk and the risk of risk, between precaution and prevention. No difference remains compared to the case where objective probabilities are available from the outset. Uncertainty owing to lack of knowledge is brought down to the same plane as intrinsic uncertainty due to the random nature of the event under consideration. […] In truth, one observes that applications of the precautionary principle generally boil down to little more than a glorified version of cost-benefit analysis” (Dupuy, 2004: 78-79).
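Dupuy’s conclusion can be restated in elementary decision-theoretic terms (an illustrative sketch with invented figures, not taken from his text). Once a subjective probability \(p\) is attached to the feared harm, “precaution” reduces to the familiar expected-cost inequality of cost-benefit analysis:

\[
\text{take the protective measure} \iff p \cdot D > C,
\]

where \(D\) is the estimated damage and \(C\) the cost of the measure. With, say, \(p = 10^{-3}\), \(D = 10^9\) and \(C = 10^6\), the measure is adopted exactly as prevention under quantified risk would dictate; nothing in the calculation registers that \(p\) expresses a bettor’s coherence rather than an observed frequency.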
More advanced perspectives on science policy are moving beyond the PP. They go beyond the idea of an emergency principle for science, supporting instead a more general democratization of scientific expertise and public participation in science-based decisions for public policy. The appearance of risks and uncertainties linked to the social implementation of science has revealed a twofold need: first, the need to widen consultation among scientists where divisions of opinion arise about the possible occurrence of potentially harmful events; second, the opportunity to involve citizens more closely in science-based decisions that directly concern civil society (Irwin and Wynne, 1996; Nowotny, 2003).
The changes in the relation between science and society are deeply modifying institutional structures and all the rights linked to the notion of a social contract, and particularly to the idea of a constitutional state (Fuller, 2000). The political rights granted to citizens in liberal democratic governments lato sensu have mostly been those that allow people to express their political orientation through their vote. The need to make decisional procedures inside the institutions more visible and transparent has more recently given shape to a new kind of (at least potential) participation in government action, through what is increasingly recognized as the citizens’ right to know.
The store of guarantees that defines the very idea of a constitutional state has not adequately reached the relationship among science, individuals and institutions. The appointment of experts, the setting up and running of scientific and technical boards, and scientific knowledge itself – considered the expression of an objective and certain method – have not been treated as problematic from the point of view of the protection that the state offers its citizens (De Schutter et al., 2001). The need to introduce specific guarantees and rights, as well as to promote greater democratic participation by civil society, today specifically concerns the regulation of science, a field from which citizens have so far been almost completely absent.
This vision of the relationship between science and society does not refuse to acknowledge the privileged character of scientific language. Science may speak particularly reliable words, but it does not have the power to utter the exclusive or final word about social choices. We must establish the conditions of public acceptance of the different kinds of knowledge; we must determine the forms of public control over such knowledge and over the methodological and axiological assumptions that guide its operation; no form of knowledge may be asserted solely on the basis of a predefined validity or truth.
In this sense, the governance of science is a problem of democratic responsibility. Here the word democracy does not refer to the predominance of a majority, but to the open and non-authoritarian character of any language (including scientific ones). Responsibility is not just the attribution of causality to someone, but one’s willingness to take charge of actions and decisions even when they are uncertain. In a sense, responsibility does not begin with a foreseeable and predictable future but, on the contrary, when we face the unknown.
Every social decision must be screened in different settings and through a plurality of forms of knowledge, comparisons and transactions. Moreover, law becomes the place where different forms of knowledge and different languages are discussed and guaranteed through the participation of different subjects.
It would be reductive to interpret such a position as antiscientific. It does not amount to a limitation of the freedom of science and scientists – provided that such freedom is ethically qualified and not seen as merely arbitrary in its exercise. On the contrary, it is a question of favoring a deeper comprehension of the complex links between science and society, and of devising more adequate ways and procedures for the scientific and technological choices that lie at the root of social and civil transformations.
References:
Bodansky, D., 1994. The precautionary principle in US environmental law. In: O’Riordan, T., Cameron, J. (Eds.), Interpreting the Precautionary Principle. Earthscan, London, pp. 203-228.
Commission of the European Communities, 2000. Communication from the Commission on the Precautionary Principle, Brussels 2.2.2000, COM(2000)1.
De Schutter, O., Lebessis, N., Paterson, J. (Eds.), 2001. Governance in the European Union. Office for Official Publications of the European Communities, Luxembourg.
Dupuy, J.P., 2004. Complexity and Uncertainty: A Prudential Approach to Nanotechnology. European Commission, A Preliminary Risk Analysis on the Basis of a Workshop Organized by the Health and Consumer Protection Directorate General of the European Commission, Brussels, 1-2 March 2004.
European Environment Agency, 2001. Late Lessons from Early Warnings: The Precautionary Principle 1896-2000.
Fuller, S., 2000. The Governance of Science. Open Univ. Press, Buckingham.
Funtowicz, S.O., 2001. Post-normal science. Science and governance under conditions of complexity. In: Tallacchini, M., Doubleday, R. (Eds.), Science Policy and the Law: Relationships Among Institutions, Experts, and The Public, Notizie di Politeia, vol. XVII, 62, pp. 77-85.
Hacking, I., 1986. Culpable ignorance of interference effects. In: MacLean, D. (Ed.), Values at Risk. Rowman & Allanheld, Totowa, NJ, pp. 136-154.
Irwin, A., Wynne, B. (Eds.), 1996. Misunderstanding Science? The Public Reconstruction of Science and Technology. Cambridge Univ. Press, Cambridge.
Jonas, H., 1985. The Imperative of Responsibility: In Search of an Ethics for the Technological Age. University of Chicago Press, Chicago.
Nowotny, H., 2003. Democratising expertise and socially robust knowledge. In: Liberatore, A., Funtowicz, S. (Eds.), Democratising Expertise, Expertising Democracy, special issue of Science and Public Policy 30 (3), pp. 151-156.
Nowotny, H., Scott, P., Gibbons, M., 2001. Re-Thinking Science: Knowledge and the Public in an Age of Uncertainty. Polity Press, Cambridge.
O’Riordan, T., Cameron, J. (Eds.), 1994. Interpreting the Precautionary Principle. Earthscan, London.
Raffensperger, C., Tickner, J. (Eds.), 1999. Protecting Public Health and the Environment. Implementing the Precautionary Principle. Island Press, Washington, DC.
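Ravetz, J.R., 1999. What is post-normal science? Futures 31 (7), pp. 647-653.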
Shrader-Frechette, K.S., 1996. Methodological rules for four classes of scientific uncertainty. In: Lemons, J. (Ed.), Scientific Uncertainty and Environmental Problem Solving. Blackwell, Oxford, pp. 12-39.
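Smith, R., Wynne, B. (Eds.), 1989. Expert Evidence: Interpreting Science in the Law. Routledge, London.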
Tallacchini, M., 2002. Epistemology of the European identity. J. Biolaw Bus., Suppl. Ser., pp. 60-66.
Mariachiara Tallacchini is full professor of Philosophy of Law. She teaches ‘Philosophy of Law’ and ‘Science, Technology and Law’ at the Law Faculty of the Catholic University of Piacenza. She has also taught ‘Bioethics’ at the Faculty of Biotechnology of the State University of Milan since 1998, and was Professor of ‘Science, Technology and Law’ at the Faculty of Philosophy of the University San Raffaele of Milan in 2004-2005. Her main interests focus on the relationships between science and the law. She is the author or (co-)editor of several books and articles on bioethics, biotechnology and the law (see her forthcoming Stato di scienza, Casa Editrice Università La Sapienza, Roma 2006; and G. de Wert, R. ter Meulen, R. Mordacci and M. Tallacchini, Ethics and Genetics: A Workbook for Practitioners and Students, Berghahn Books, Oxford-New York 2003).