Transportation reliability and vulnerability

This is a philosophical essay on transportation vulnerability, bringing together three fields or subjects: engineering (reliability and vulnerability), economics (costs and benefits) and politics (decision making). The idea behind the research is to blend statistical, economic and political arguments in order to achieve a novel and unifying framework for decision making within transportation planning. By adding reliability and vulnerability to the traditional equations of costs and benefits, it is hoped that transportation planners and professionals will not only consider economic arguments, but also dare to take on political statements that may be in opposition to strictly factual costs and benefits.

Originally written in April 2004, revised and updated in March 2009

1 Introduction and outline

Independent thought, critical analysis, individual judgement and expertise in the application of research theory and practice are but a few of the ingredients necessary to formulate, develop, carry out and successfully complete any kind of research work. Unfortunately, many researchers are unaware of, or worse, blatantly ignorant of, the underlying tacit assumptions that operate in their field of research and under which they work. This paper will address this issue for the author’s own research on reliability and vulnerability as input parameters in cost-benefit analyses. Thus, the aim of this paper is to clarify which scientific approach or school of thought this author is inclined to follow in his research, and how this has affected and will continue to affect his line of research.



A research subject that sees its aim as decision support, balancing different sides and arguments, rather than holding one objective truth, has much in common with the explorative nature of Lakatosian research programmes. At the same time it has no overall common assumptions or paradigms it needs to adhere to in a Kuhnian sense: reliability analysis is deductive in nature; vulnerability adds a Bayesian vagueness to reliability; costs and benefits are firmly set in the utilitarian thought of classical economics, implying society as the sum of rational individuals. Combining these diverging views is challenging, but not insurmountable, provided one follows Giordano Bruno’s idea of pictorial concepts rather than developing intricate formulae.

Firstly, the scientific and historical backdrop for the terms reliability, vulnerability and cost-benefit analysis, as they will be used in the research that follows, will be described. Secondly, different approaches to knowledge for resolving the methodological and epistemological problems within the research will be evaluated. Thirdly, and finally, the author’s own approach and insights will be stated.

2 Defining the object of analysis

The working title of the research project is “when you cannot get from here to there” or, in proper academic terms, “reliability and vulnerability versus costs and benefits”. It aims at investigating the most typical factors that affect the reliability, and particularly the vulnerability, of the transportation infrastructure, and in consequence at establishing a framework for how these factors should be approached in terms of costs and benefits.

Rationale

The rationale for the research is that the reliability or vulnerability of the transportation network is only seldom looked at as a decision parameter in cost-benefit analyses, especially for new projects. For the most part, saved travel time is what drives the policy of the decision makers; increased reliability is as a rule not a subject for closer evaluation, but simply taken for granted. By adding reliability and vulnerability to the traditional equations of costs and benefits, it is hoped that transportation planners and professionals will not only consider economic arguments, but also dare to take on political statements that may be in opposition to strictly factual costs and benefits. In essence, three fields or subjects are brought together: engineering (reliability and vulnerability), economics (costs and benefits) and politics (decision making). The purpose of this essay is to bring forth and highlight the epistemology, ontology and methodology of the three fields that make up the backdrop for this research, and to look at how their tacit assumptions will affect the undertaking of the research that is to follow.

Is this research scientific?

Webster’s Dictionary defines “science” as “the state of knowing: knowledge as distinguished from ignorance or misunderstanding”, and everyday connotations often associate science with “objective knowledge” or “absolute truths” about an object of study, knowledge that does not necessarily need to be acquired through personal experience. Knowledge applies to facts or ideas acquired by study, investigation, observation, or experience. With the author’s engineering, and thus commonly accepted “scientific”, background in mind, it is only natural for him to search for some objective knowledge in his chosen field of research. The Oxford Advanced Learner’s Dictionary defines science as “knowledge about the structure and behaviour of the natural and physical world, based on facts that you can prove”. Two questions then arise for the author: Firstly, does there or does there not exist factual and provable knowledge about his field of study, and why, or why not? Secondly, where in the realms of the sciences is this research situated, and from which perspective is he (am I) approaching his (my) research? These are the questions that this essay will attempt to answer.

Where did it all start? The author’s initial interest in this particular research was stirred by the work done by Katja Berdica for her PhD (Berdica, 2002) at the Royal Institute of Technology in Stockholm, Sweden. Her thesis forms much of the bedrock for this author, as he picks up where Berdica left off:

Road vulnerability analysis could thereby be regarded as a hub for the whole battery of transport studies needed to gain the insights necessary to describe how well our transport systems work in different respects, what steps to take and what policies to implement in order to reach desired goals. (Berdica, 2002, p.127)

It is tempting, then, to view vulnerability analysis in terms of a Lakatosian research programme, spinning off into various strands (batteries) of research, each strand developing different or diverging insights, from which only the fittest will survive. At the same time, there is also a notion of converging insights, and Berdica suggests that:

[…] vulnerability in the road transportation system should be brought out and recognized as a crucial part in the infrastructure, and […] be the meeting point for all the different strands of transport reliability research and other issues. (Berdica, 2002, p.127)

Such a view acknowledges that an interdisciplinary approach is necessary to bring the research forward and to develop new ideas. Subjecting hard core (transportation network) reliability to interdisciplinary critique will undoubtedly allow for the protective belt to be tested in order to generate new ideas. It is hoped that the research that underlies this essay will provide the necessary drive to turn transportation network reliability studies into living and generating, not degenerating, research.

Transportation network analysis has long had its roots in the natural sciences and civil engineering departments, and traditionally, the reliability of transportation networks is examined with a systems engineering approach. Reliability is here dependent upon functioning or non-functioning links within the network, and may be defined as the degree of stability of the quality of service that a transportation system offers.

More often than not, transportation network reliability is limited to two aspects only, videlicet connectivity and travel time reliability. Although it conceptually makes perfect sense to sum up reliability in these two variables, there is an almost unlimited number of factors that may or may not contribute to the final value and that should thus be part of the overall equation. Hence, an interdisciplinary Lakatosian research programme approach is definitely required.

From a user perspective, what matters most in relation to a transportation network is its availability at the given time of travel. In other words: can I get from A to B using the intended route and means of transport at the desired time of travel? That would be the best case. The worst case, to the user, is that there exists no route or means of travel at all that can take me from A to B at the desired time. Consequently, the probability of the network being available, accessible and offering the desired level of service may be used as a measure of reliability. Conversely, the road network’s susceptibility to not being available, accessible and offering the desired level of service may be construed as a measure of vulnerability.

3 Reliability, vulnerability, costs, benefits and decisions

Reliability

Robert Lusser is by many regarded as the father of reliability, and Lusser’s law is perhaps the most quoted and most used equation in reliability engineering. It defines the reliability of a series system as the product of the individual component reliabilities. Formulated during WWII, it established the notion that no system is stronger than its weakest component, and was first used to improve the success rate of the German V-1 bombs that rained over England.
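In symbols, and assuming independent components as Lusser’s law does, the reliability of a series system of $n$ components with reliabilities $R_1, \dots, R_n$ reads:

$$R_{\text{system}} = R_1 \cdot R_2 \cdots R_n = \prod_{i=1}^{n} R_i$$

Since every $R_i$ is at most 1, the product can only shrink as components are added, which formalises the weakest-component intuition: $R_{\text{system}} \leq \min_i R_i$.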

The reliability of any object is most commonly described as the probability that this object will function (or fail) according to an object-specific set of standards. Put very crudely, it is the ratio of observed functioning events to all observed events. If an item functions in 95% of the observed cases, then it is 95% reliable, or in other words, the probability that the item will function is 0.95, since reliability by convention is described as a number between 0 and 1 rather than as a percentage. The literature on reliability is immense, as it draws upon statistics and probability theory, both immense subjects in their own right. Suffice it to say that probability, and thus reliability, has its roots in deductive reasoning based on statistics. Probability theory traditionally presupposes classical set theory and classical logic, relying heavily on deductive reasoning. Does reliability (probability) theory qualify as science? Yes, in the most classical sense it appears to, because it claims to determine absolutely the probability of the functioning of an object. If an object is 95% reliable, it will function 95 out of 100 times. Consequently, in maintenance practice, the object is replaced before it reaches its probable lifetime of 95 ons and offs. Nevertheless, the object could fail long before its probable lifetime, or go on working well beyond it. In this sense, probability turns into possibility, and is no longer a science.
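As a minimal sketch of the two notions just described, and not of any method prescribed by this research, consider the following; all numbers are hypothetical illustrations:

```python
# Minimal sketch: empirical reliability and Lusser's law for a route of links.
# All figures are hypothetical illustrations, not data from the research.

def reliability(functioning: int, observed: int) -> float:
    """Crude estimate: ratio of observed functioning events to all observed events."""
    return functioning / observed

def route_reliability(link_reliabilities: list[float]) -> float:
    """Lusser's law: a series system is the product of its component reliabilities."""
    product = 1.0
    for r in link_reliabilities:
        product *= r
    return product

link = reliability(95, 100)  # a link functioning in 95 of 100 cases: 0.95
# A route over three such links in series is already weaker than any single link:
print(route_reliability([link, link, link]))  # 0.857375
```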

Statisticians mostly base their assumptions on observations of real-world events, from which certain predictions (future events) or correlations (past events) are made. If the deductions are correct, the theory will live up to the experience, or vice versa. However, how can one be certain that what is observed represents the full truth about the world; or if not the world, then at least the truth about the subject of interest? This is where Bayes’ theorem comes into play. Bayes’ theorem is used in statistical inference to update estimates of the probability that different hypotheses are true, based on observations and knowledge of how likely those observations are, given each hypothesis. In essence, by applying a Bayesian approach to probability, and thus reliability, it is possible to come up with a degree to which we feel our predictions are true (probable). Bayesianism is the philosophical tenet that the rules of mathematical probability apply not only when probabilities are relative frequencies assigned to random events, but also when they are degrees of belief assigned to uncertain propositions. Bayes’ theorem thus becomes a way of fitting reliability with a less absolute notion, while still upholding the scientific credibility of delivering the one true answer.
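By way of illustration only, Bayes’ theorem states that $P(H \mid E) = P(E \mid H)\,P(H)/P(E)$ for a hypothesis $H$ and evidence $E$. A standard textbook device for such updating, not a method prescribed by this research, is the conjugate Beta-Binomial model, sketched here with hypothetical numbers:

```python
# Sketch: Bayesian updating of a degree of belief about a link's reliability,
# using the standard Beta-Binomial conjugate pair. All numbers are hypothetical.

def update_beta(alpha: float, beta: float, successes: int, trials: int) -> tuple[float, float]:
    """Posterior Beta parameters after observing `successes` in `trials`."""
    return alpha + successes, beta + (trials - successes)

def posterior_mean(alpha: float, beta: float) -> float:
    """The updated degree of belief that the link functions."""
    return alpha / (alpha + beta)

a, b = 1.0, 1.0  # a vague prior: Beta(1, 1), i.e. uniform over [0, 1]
a, b = update_beta(a, b, successes=95, trials=100)
print(posterior_mean(a, b))  # ~0.941 - a degree of belief, not an absolute frequency
```

The point is merely that the output is a revisable degree of belief rather than a final frequency.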

Vulnerability

Berdica defines the vulnerability of a road transportation system as the susceptibility to incidents that can result in a considerable reduction in road network serviceability, with reliability describing adequate serviceability under the operating conditions encountered at a given time.

With vulnerability as a complement to reliability, the issue of vulnerability can now be described in the well-established and solid framework of reliability theory. Consequently, the methodology used in analysing vulnerability will be analogous to the methodology used in analysing reliability. This being the case, we may see vulnerability as non-reliability and vice versa.
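Read this way, a minimal formalisation, offered as one interpretation of the “vice versa” above rather than a definition from the literature, would be:

$$V = 1 - R$$

where $R$ is the reliability and $V$ the vulnerability of the same object under the same operating conditions.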

Again transferring this to the Lakatosian view of research programmes, vulnerability analyses then represent the positive heuristics that are legitimate and worthwhile pathways of research (Smith, 1998), with the underlying theories of reliability being the hard core holding the key foundational assumptions. Properly and adequately describing what constitutes vulnerability then (in a strange twist of literal irony) becomes the protective belt of auxiliary hypotheses which can be tested and falsified.

Note here that there is one important addition to the traditional reliability approach that appears in Berdica’s application of reliability to describe the vulnerability of a transportation network: reliability is linked explicitly to the ability to supply adequate serviceability, not simply to the probability of failure, as in the classic case of reliability studies. The very notion of susceptibility implies something more than probable and statistically predictable failure, namely the influence of external circumstances, circumstances that cannot be predicted and which may, but will not necessarily, have an influence on the reliability numbers. In this sense, pure reliability appears to be strongly independent of external circumstances; vulnerability, on the other hand, is not. Reliability as such is an intrinsic and stable value; vulnerability is variable. This perspective on vulnerability leads to the conclusion that although there can be a quasi-objective measure of reliability, there cannot be an objective measure of vulnerability. Consequently, vulnerability is less measurable than reliability, and is thus open to discussion.

This brings us back to the earlier discussion of research programmes, and it is the hope of this research that bringing vulnerability and reliability into the field of transportation network analysis will generate new debate, new ideas and new research avenues.

Costs and benefits

Jeremy Bentham (1748-1832) is often credited with being the father of cost-benefit analysis, one of the many tools of both macro- and microeconomic analysis. Bentham was one of the first utilitarians, and devised a means of calculating the consequences of various actions in terms of utility. The underlying principle of his felicific calculus is still applied in modern-day cost-benefit analysis, but has also found its way into politics and economics. Using Bentham’s felicific calculus it is possible to ascertain which action will cause the greatest good for the greatest number. The felicific calculus was sketched by Bentham in chapter 4 of his Introduction to the Principles of Morals and Legislation, see Burns and Hart (1970):

When determining what action is right in a given situation, we should consider the pleasures and pains resulting from it, in respect of their intensity, duration, certainty, propinquity, fecundity (the chance that a pleasure is followed by other ones, a pain by further pains), purity (the chance that pleasure is followed by pains and vice versa), and extent (the number of persons affected). We should next consider the alternative courses of action: ideally, this method will determine which act has the best tendency, and therefore is right.

Since classical utilitarian theory considers the rightness of an action as a function of the goodness of its consequences, the felicific calculus could, in principle at least, establish the moral status of any considered act. In strictly utilitarian terms the felicific calculus allows for society to justify inequality, if this, in sum, generates more happiness than equality. Critics would claim that the felicific calculus fails as a moral theory because there is no principle of fairness. Carried to the extreme, it would be moral to torture one person if this would produce an amount of happiness in other people outweighing the unhappiness of the tortured individual. Critics also point out that the happiness of different people is incommensurable, and thus a felicific calculus is impossible in practice.

Nevertheless, with a lesser moral undertone the basic principles of this method are still used today, and are the linchpin of today’s cost-benefit analysis: if the overall benefits (to society) of any action or decision outweigh the overall costs (to society), then the action or decision is justified, or at least economically sound. On a personal side note, it may be worth pondering the welfare system of today’s modern society, which is built on achieving equality for everyone. Consider, for a moment, whether more equality among people also means more happiness among them, or whether inequality is in fact justified, because the happiness of the better-off outweighs the misery of the worse-off.

Utilitarianism was revised and expanded by Bentham’s more famous disciple, John Stuart Mill. In Mill’s hands, “Benthamism” became a major element in the liberal conception of state policy objectives. Utilitarianism has been abandoned as a moral philosophy in modern-day society, but continues to live on in two key areas: in the reasoning of everyday life and in neoclassical economics. Classical economic theory is founded on Bentham’s ideas, where the hedonistic nature of the individual as a pleasure-seeking and pain-avoiding being (a.k.a. the consumer) remains unfettered and unchanged by religion, philosophy, morals or culture. More euphemistic in name, but no less hedonistic in nature, is Adam Smith’s economic man. At the heart of economic theory is homo economicus, the economist’s model of human behaviour. In traditional classical economics and in neoclassical economics it was assumed that people acted (only and always) in their own self-interest. Adam Smith argued that society was made better off by everybody pursuing their selfish interests through the workings of the invisible hand: the ability of the free market to allocate factors of production, goods and services to their most valuable use, everybody acting from self-interest, spurred on by the profit motive.

Does cost-benefit analysis qualify as science?

Halvorsen (1989) answers this question in part, initially accepting, based on Chalmers (1982), and then finally discarding, Lakatos’ idea of research programmes as a valid view of how economic reasoning develops new ideas and theories. The point he makes is that human beings seldom act rationally and in pure self-interest, and thus rarely comply with economic models. Humans, he argues, act upon the norms and morals of society, often in conflict with any sense of utility-seeking rational behaviour. In a Lakatosian sense, these contradictions in the protective belt are the motive power towards change and development of economic theories. However, these contradictions are usually not testable for every individual, and economists therefore must rely on theory rather than experience.

The most concise and probably most used definition of economics is the study of how society uses its scarce resources. Thomas Carlyle, a 19th-century Scottish writer, coined the phrase “the dismal science” when referring to economics. The phrase refers to the unreliability of economics in comparison with conventional sciences such as mathematics, physics, or biology, the unreliability being the inability to ascertain any clearly defined laws, because in the market some unidentifiable factor may always be influencing a particular event or phenomenon. It is nearly impossible to isolate any given variable in economics, so the dismal science is mainly based on theories. These theories may contradict each other, like efficient market theory and behavioural finance, but they may be proven true in certain cases or even at the same time. Furthermore, when studying economics, evidence often turns out to be coincidence more than fact. Andreassen (1989) elaborates on this in describing the rational man as a fictional and virtual construct that exists only in the minds of economists, a Frankenstein’s monster consisting of bits and pieces of various separate models of behaviour. The economic man then becomes a representation and simplification of a reality that is too complex to be understood and fully explained by economists. If it is the objective of science to deliver knowledge and truth about the (real) world, then economics fails miserably as a science. Nonetheless, economists often put a high degree of confidence in their models and predictions, with contradictory evidence as a result. The recent debate over interest rates and deflation in Norway, though unrelated to cost-benefit analysis, may serve as a small illustration here.

A discussion of the rational behaviour of individuals is particularly necessary in the analysis of the reliability of transportation networks. From a systems theory point of view the matter of functioning or non-functioning links seems easy enough. What is often forgotten is that the transportation system is populated by individuals whose behaviour is not easily predicted. Although the increasing incidence of road rage suggests that many travellers or drivers act out of self-interest, we can also compare traveller behaviour to the well-known prisoner’s dilemma (Smith, 1998), in adjusting one’s own travel habits by second-guessing other travellers’ behaviour.

Decision-making

The reason for bringing reliability and particularly vulnerability into the realm of cost-benefit analysis is to have decision makers consider other points of view than purely monetary or “scientifically objective” numerical (i.e. “true”) arguments.

Decision making is a process that leads to a choice between a set of alternatives, and can be split into the following sequential process:

  • Defining the decision problem (objective)
  • Determining the set of evaluation criteria to be used
  • Weighting the criteria
  • Generating alternatives
  • Applying decision rules
  • Recommending the best solution to the problem

Any decision-making process begins with the definition of the problem or the objective to be reached. Once the decision problem is defined, what follows is setting up a set of criteria that reflect all concerns of the problem, and finding measures of the degree to which each criterion is achieved. The purpose of weights is to express the importance or preference of each criterion relative to the other criteria. Alternatives are often determined by constraints, which limit the decision space of feasible alternatives. Decision rules integrate criteria, weights and preferences to generate an overall assessment of the alternatives. Recommendations are then based on a ranking of the alternatives.
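To illustrate how criteria, weights and a decision rule come together, here is a minimal weighted-sum sketch; the criteria, weights and scores are hypothetical, and the weighted sum is only one of many possible decision rules:

```python
# Minimal multi-criteria sketch: score alternatives by weighted criteria.
# Criteria, weights and scores are hypothetical illustrations.

criteria_weights = {"travel_time": 0.5, "cost": 0.3, "reliability": 0.2}

# Normalised scores (0..1, higher is better) for each alternative on each criterion.
alternatives = {
    "new_bridge":   {"travel_time": 0.9, "cost": 0.3, "reliability": 0.8},
    "upgrade_road": {"travel_time": 0.6, "cost": 0.7, "reliability": 0.6},
    "do_nothing":   {"travel_time": 0.2, "cost": 1.0, "reliability": 0.4},
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Decision rule: sum of criterion scores weighted by their importance."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank the alternatives by overall score; the top one is the recommendation.
ranking = sorted(alternatives,
                 key=lambda a: weighted_score(alternatives[a], criteria_weights),
                 reverse=True)
print(ranking)  # ['new_bridge', 'upgrade_road', 'do_nothing']
```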

However, all decision making has a degree of uncertainty, ranging from a predictable (deterministic) situation to an uncertain situation. The latter can be subdivided into stochastic decisions (which can be modelled by probability theory and statistics) and fuzzy decisions (which can be modelled by fuzzy set theory and others). In some situations, decision making involves the risk, or at least some uncertainty, of making a “wrong” decision, because the information acquired is insufficient or the approach used is inappropriate. Using Bayes’ theoretical framework, the uncertainty that is involved may in some cases be quantified and as such add another decision criterion to the evaluation process.

In the end, it is the decision maker who determines the criteria, the factors, the constraints, the individual weighting and the decision rules. Decisions based on reliability and probability have a scientific aura around them, being absolute by number; decisions based on vulnerability are more relative and descriptive, not absolute, since vulnerability is associated with susceptibility, meaning the state of being very likely to be influenced, harmed or affected by something. This is an individual issue, not easily translated into collective terms. Then again, the aim of this research is not to find a justifiable measure of vulnerability, but to point at vulnerability as a parameter that needs to be included in cost-benefit analyses, and to show how this can be done. This being the case, this research warrants only scientific connotations, not scientific denotations.

4 Implications for the research and development of new ideas

Bentham’s felicific calculus was said to be impossible in practice, because the (different) happiness of different people was incommensurable and could not be added together into a single value. Yet this calculus is exactly what is applied in today’s cost-benefit analyses, and it is rooted in the economic theory of supply and demand. The demand of each individual is added together as society’s total demand for a good at a given price. In a similar manner, the value that each individual attributes to this good, each individual’s valuation of it, is added together as society’s total valuation of the good. This is a representation, not a fact. Consequently, it must be kept in mind that cost-benefit analyses are a simplification of complex sets of value and appreciation, not hard core facts.
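In symbols, this aggregation is simply a sum over individuals: at a given price $p$, society’s total demand is $D(p) = \sum_i d_i(p)$, where $d_i(p)$ is individual $i$’s demand. It is the summation itself that quietly assumes the individual valuations to be commensurable.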

McCloskey’s now classic article on the rhetoric of economics (McCloskey, 1983) contends that economics, in essence, like the proverbial emperor, has positively no clothes on, yet economics has always claimed and continues to claim that it is steeped in science. McCloskey goes as far as suggesting that statistical significance, regardless of how useful it is as input into economic significance, is not the same thing as economic significance. Hájek and Hall (2002) note that hypothesis testing at a constant significance level is inconsistent with Bayesian inference and decision theory, and that thus both classical and Bayesian statistics need to be properly integrated into a more general theory of decision. This is also noted by Aven (2003) in Foundations of Risk Analysis. Making good decisions is to a large extent about the quality of the decision-making process: the definition of the problem, the overall goals, criteria and preferences, the identification of alternatives, the use of risk and decision analyses, and the managerial review and judgement. There are no objective ways of measuring this quality, but one can of course be more or less confident about the process underlying the decision. Taking proper consideration of all parties is not always easily translated into mathematical formulae, and perhaps it shouldn’t be. In dealing with societal considerations, more can probably be gained by deliberation: for people to confer, ponder, exchange views, consider evidence, reflect on matters, negotiate and attempt to persuade each other. This is the hard core of this author’s research.

In the concluding paragraphs of their article on induction and probability, Hájek and Hall (2002) suggest the need for more cross-fertilisation between Bayesian and classical statistics, as seen in Mayo’s theory of error statistics. The centrepiece of Mayo’s argument is the notion of severity, referring to the notion that any method or procedure of testing must also consider how the data were generated, modelled and analysed to obtain relevant evidence in the first place.

Hájek and Hall (2002) foresee that

in this brave new world of inter-disciplinarity and rapid communication, inferential methods developed within one field are increasingly likely to be embraced by practitioners of another.

Devising a methodology that allows for the inference of reliability and vulnerability is one of the tasks in this research. The inferred values of reliability and vulnerability are then used in cost-benefit analyses.

The purpose of a cost-benefit analysis is to weigh the costs of a proposed project against the benefits of the project. If the benefits exceed the costs, then the project increases society’s welfare. If the costs exceed the benefits, society will experience a loss of welfare. The argument of vulnerability versus reliability is analogous: any vulnerability of the transportation system causes disruptions that incur costs, which are in fact a loss of welfare; vulnerability is thus a quantifiable cost. If society puts measures in place to reduce the vulnerability of the transportation network, the increased reliability represents a benefit.
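As a minimal sketch of this line of reasoning, with vulnerability expressed as an expected yearly disruption cost, and with all figures hypothetical and in arbitrary money units:

```python
# Sketch: folding vulnerability into a cost-benefit comparison.
# Vulnerability is expressed as an expected yearly disruption cost;
# all figures are hypothetical and in arbitrary money units.

def expected_disruption_cost(p_disruption: float, cost_per_disruption: float) -> float:
    """A quantifiable cost of vulnerability: probability times consequence."""
    return p_disruption * cost_per_disruption

def net_benefit(benefits: float, costs: float) -> float:
    """Positive: the project increases welfare. Negative: a welfare loss."""
    return benefits - costs

baseline = expected_disruption_cost(0.10, 50.0)   # no mitigation: 5.0 per year
mitigated = expected_disruption_cost(0.02, 50.0)  # with mitigation: 1.0 per year
# The increased reliability is a benefit of 4.0 per year, against a yearly
# mitigation cost of 2.0:
print(net_benefit(baseline - mitigated, 2.0))     # 2.0 -> welfare gain, justified
```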

One question that remains open at this stage in the research is whether the inferred values or measures of reliability and vulnerability are in fact commensurable or not. Then again, keeping the above argument about confidence in mind, commensurability may not be required if decision support is an analytical tool providing well-founded information, not final answers.

5 Conclusion

The initially stated purpose of this essay was to bring forth and highlight the epistemology, ontology and methodology of the three fields that make up the backdrop for this research, and to look at how their tacit assumptions will affect the undertaking of the research that is to follow.

Whether or not this has been critically evaluated to the point remains to be seen in the forthcoming research. Nonetheless, the essay has raised awareness of the danger of entering headfirst into research without first acknowledging and appreciating the historical and philosophical timeline along which the topic has evolved.

The aim of this paper was to clarify which scientific approach or school of thought this author is inclined to follow in his research. Three fields or subjects are brought together: engineering (reliability and vulnerability), economics (costs and benefits) and politics (decision making). The idea behind the research is to blend statistical, economic and political arguments in order to achieve a novel and unifying framework for decision making within transportation planning.

In summing up, this research may very well be a discourse of its own, not fitting into any particular one of the traditional contexts that surround the three fields or subjects. What it tries to do is to blend statistical, economic and political arguments in order to achieve a novel and unifying framework for transportation planning. Reliability and vulnerability are seen as crucial input parameters underpinning all further research. The probability of the network being available, accessible and offering the desired level of service may be used as a measure of reliability; conversely, the road network’s susceptibility to not being available, accessible and offering the desired level of service may be construed as a measure of vulnerability.

Rooted in reliability theory, objective truths or numbers regarding the reliability of the road network are used to assess its possible, yet not necessarily probable, vulnerability, because susceptibility by definition is not always predictable. In this research, reliability or probability is subjected to its Bayesian susceptibility to uncertainty, and a vulnerability analysis takes advantage of and exploits this uncertainty. Bearing in mind that we cannot be certain of the absolute value of the reliability of a road network, the utility to society regarding reliability must also be viewed as uncertain, and so must the ratio of costs to benefits. The translation of qualitative measures into quantitative measures that is necessary to perform a cost-benefit analysis already involves an element of uncertainty in itself, “worsening” and weakening its objectivity, whether the numbers for costs and benefits address vulnerability or reliability. What this goes to show is that there can be no objective measures of vulnerability, and thus no objective decisions, in that any decision to mitigate vulnerability is a qualified guess rather than a well thought-out process. That said, there is nothing wrong with qualified guesses.

Vulnerability analyses present predictions and uncertainty assessments of observable quantities and provide decision support, not hard core decisions. The analyses need to be put into a wider decision-making context, which we may refer to as a managerial review and judgement process, and this process then results in a decision.

A research subject that sees its aim as decision support, balancing different sides and arguments, rather than holding one objective truth, has much in common with the explorative nature of Lakatosian research programmes. At the same time, it has no common assumptions or paradigms it needs to adhere to in a Kuhnian sense.

Nearing the final conclusion of this essay, in attempting to unify statistics, economics and politics, probably (not meant in a scientific, statistical sense) the best guideline for the research to come is to follow Feyerabend’s golden rule of “anything goes”. Without a fixed ideology that can comprehensively span three fields of research, the only approach which does not inhibit progress and the discovery of new knowledge is in fact “anything goes”:

‘anything goes’ is not a ‘principle’ I hold […] but the terrified exclamation of a rationalist who takes a closer look at history.

If rationalism is considered to be a philosophical doctrine that asserts that truth should be determined by reason and factual analysis, then “anything goes” implies that whatever reason constructs and manages to justify as factually true is what in the end establishes itself as the truth, until a new or different truth is established and justified. For the prospects of this research, such a statement means admitting that one’s own research sooner or later will be inadequate and insufficient as grounds for the kind of decision support it is supposed to deliver.

In finishing this paper and preparing for the vexing critique of my contemporaries, the words of Giordano Bruno come to mind, as he faced his trial in 1600, before being burned at the stake for heresy:

Maiori forsan cum timore sententiam in me fertis quam ego accipiam. (Perhaps you, my judges, pronounce this sentence against me with greater fear than I receive it.) Giordano Bruno (in: White, 2002)

The philosophical kinship that this author feels toward Giordano Bruno is indeed a fitting finale to this research essay. Giordano Bruno was a philosopher who took the current ideas of his time and extrapolated them to new and original vistas. He claimed that all matter was intimately linked to all other matter, that we live in a universe in which all things are related. Bruno is generally not considered a scientist, yet, ironically enough, it is perhaps the scientific element of Bruno’s work that presents us with the most lateral links between his ideas and modern thinking. Bruno was never in any sense a practical researcher, to quote White (2002): he did not think in terms of mathematics or experiment. In fact, he actively disapproved of the way the new science of his time was becoming increasingly entwined with mathematical proof and purity, much in the same manner that this author disapproves of the way cost-benefit analysis represents utility as an economic and mathematical measure that can be compared against other economic and mathematical measures.

Giordano Bruno is unique compared to the other martyrs of his time because of the power of his forward-thinking, where others were personal-thinking or contrary-thinking. Bruno held a broader vision; his heresy was all-embracing. He defended the rights of all humans to think as they wished; he offered an alternative to the ideas enforced by orthodoxy. He was a man who wished to steer humanity towards reason, who wanted to allow us to conceptualise freely rather than have our thoughts determined for us, free from Kuhnian paradigms, free from the hard core assumptions of Lakatosian research programmes.

Bruno advocated the use of conceptualising, that is, thinking in terms of images. He said that to think was to speculate with images. Complex scientific correlations are often better explained in pictures than in mathematical formulae. Consequently, Bruno was able to rationalise his theories, even though he used no mathematics. Bruno’s vision of picture logic is used by almost anyone in the industrialised world today; just take Windows® software as an example. The workings of Windows are (as most users plainly see) based upon logically connected images.

Interestingly enough, Bruno predated Karl Popper by three and a half centuries when he wrote:

He who desires to philosophise must first of all doubt all things. He must not assume a position in a debate before he has listened to the various opinions and considered and compared the reasons for and against. He must never judge or take up a position on the evidence of what he has heard, on the opinion of the majority, the age, the merits, or prestige of the speaker concerned, but he must proceed according to the persuasion of an organic doctrine which adheres to all real things, and to a truth that can be understood by the light of reason. (Giordano Bruno, De immenso, De monade, De minimo, 1591)

Rather than spinning his ideas from the yarn of algebra, the cobweb of modern science, Bruno moulded pictures and manipulated visual images to interpret complex ideas. This, by coincidence, comes close to McCloskey’s metaphor-laden rhetoric of economics, and makes the argument that economics presents itself in a Brunoan manner, while attempting to uphold its scientific bedrock of mathematics.

The research that this author is carrying out aims at taking reliability and vulnerability into the realm of cost-benefit analysis to serve as decision support. Decisions should not be determined by numbers alone; decisions should be fully envisioned and comprehended by the decision makers. This is only possible by speculating with images about what the outcome of the decision will be. Following Bruno’s lead, leaving the mathematical world of reliability (probability) and cost-benefit (economics) behind, the research that is to follow intends in every way to be rich in images, but poor in formulae.

6 Bibliography


Andreassen, L. (1989) ‘The representative decision maker: a Frankenstein encountering reality’, in: K.A. Brekke and A. Torvanger (eds.) Philosophy of Science and Economic Theory, Central Bureau of Statistics in Norway.

Aven, T. (2003) Foundations of Risk Analysis : A Knowledge and Decision-Oriented Perspective. John Wiley & Sons.

Berdica, K. (2002) ‘An introduction to road vulnerability: what has been done, is done and should be done’, Transport Policy, 9(2), pp. 117-127.

Burns, J.H. and Hart, H.L.A. (1970) The collected works of Jeremy Bentham: An introduction to the principles of morals and legislation. The Athlone Press, University of London.

Chalmers, A.F. (1982) What is this thing called Science? Open University Press.

Hájek, A. and Hall, N. (2002) ‘Induction and Probability’, in: P. Machamer and M. Silberstein (eds.) The Blackwell Guide to the Philosophy of Science, Blackwell Publishers, pp. 149-172.

Halvorsen, K. (1989) ‘Lakatos as a motive power in economics’, in: K.A. Brekke and A. Torvanger (eds.) Philosophy of Science and Economic Theory, Central Bureau of Statistics in Norway.

McCloskey, D. (1983) ‘The Rhetoric of Economics’, Journal of Economic Literature, 21, June, pp. 481-517.

Smith, M.J. (1998) Social Science in question. Sage Publications, UK.

Walliman, N. (2001) Your research project: a step-by-step guide for the first-time researcher. Sage Publications, UK

White M. (2002) The Pope and the Heretic. HarperCollins Publishers
