The Causal and the Mental

The topic of mental causation in contemporary philosophy of mind is a vexed one, constrained by only one rule: avoid epiphenomenalism. There is a deep sense that there simply has to be mental causation of some kind: our psychological/mental states (beliefs, desires, intentions) must, in some way, have causal influence on our behaviour. Whatever else may or may not be the case, this is something we have to have. This is the data we have to explain.

There are, generally, two broad ways to approach the issue of mental causation (MC): either mind and body are two distinct substances (substance dualism, SD) or they are not (let’s call this broad physicalism, BP, and we’ll assume that an account of BP aims to be non-reductive). This lets us divide up the problems of MC a bit more easily:

On SD, if the mind and body are two distinct substances, one material and one immaterial, the issue becomes just how to explain, describe and define the causal relation between the mind and body. Simply put, what is the nature of the causal relation here? The objections here are well known: how could an immaterial substance stand in any kind of causal relation with a material substance? It seems deeply puzzling that such a relation should exist at all. And if the mind is independent of the brain, as SD claims, what do we make of the cases where, say, a blow to the head (or the aftereffects of brain trauma, such as agnosia) disrupts consciousness or conscious processes? The opposite would seem like the natural conclusion: if the mind truly is independent of the brain, then consciousness should continue uninterrupted by any kind of physical disturbance.

William Hasker identifies a related issue in his book ‘The Emergent Self’:

‘Philip Quinn doubts this: “I’d be willing to bet that we will learn from neuroscience…that this processing of visual information goes on in the brain. And if a Cartesian mind requires as input processed rather than unprocessed visual information in order to perform such tasks as identifying faces and reading facial expressions, it will come as no surprise that it can’t perform such tasks if its damaged brain can’t provide it with the processed visual information it requires as inputs.”

The problem with this lies in the strong probability that, as neuroscience progresses, more and more of our “advanced” mental processes will be found to be associated with, and dependent upon, specific brain processes. When that happens, Quinn’s strategy would lead us to a view of the conscious mind as essentially a passive spectator, enjoying awareness but contributing little or nothing of its own to those results.’ (p. 155)

Epiphenomenalism by any other name…

What of our other position, Broad Physicalism? Here the issue takes a similar form: how to account for mental causation with only material substance (only the physical) without lapsing into epiphenomenalism. Immediately an objection jumps out at us: if, as one would think is necessary for BP, physical explanation is necessary and sufficient, what role is there for the mental? It seems as though epiphenomenalism is waiting to pounce.

There are a few possible solutions here. Supervenient causation is one of them, but it’s difficult to find an acceptable version of it that doesn’t end up becoming reductive. Type-identity theories have more or less gone by the wayside, replaced with a more hopeful alternative, token-identity theories, where mental-event tokens are identified with physical-event tokens; this avoids the problem of multiple realization that plagued type-identity theories. Here the problem becomes one of the identification of events, and Donald Davidson supplied an account: an event is identical to another event if their causes and effects are identical. This ties in to his theory of anomalous monism: mental states are not governed by any strict laws, and thus cannot be reduced to physical states or given a purely physical explanation, since physical explanation involves strict laws. This position recognizes that mental events are physical events without reducing them to only physical events.
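
Davidson’s criterion can be put a bit more explicitly (this symbolization is mine, not Davidson’s):

$$e_1 = e_2 \;\iff\; \forall x\,(x \text{ causes } e_1 \leftrightarrow x \text{ causes } e_2)\;\wedge\;\forall y\,(e_1 \text{ causes } y \leftrightarrow e_2 \text{ causes } y)$$

where the variables range over events. That they range over events is precisely what generates the circularity Hasker presses below.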

The payoff here is that if mental events can be identified with brain events, then it is easy to see how there is mental causation. Hasker notes a difficulty here, though, with Davidson’s identity criterion:

‘Put briefly, the theory is circular: causal relations are identified by the events they relate, and events are identified by the causal relations in which they stand. Clearly, one or the other has to be identified in some other way, so as to enable us to break into this closed circle. And since there seems to be no prospect of identifying causal relations except as relations between events, it seems that it will have to be the events themselves which are characterized in some other way.’ (‘The Emergent Self’, p. 38)

Jaegwon Kim notes another difficulty with Davidson’s theory of anomalous monism:

‘Davidson’s anomalous monism fails to do full justice to psychophysical causation in which the mental qua mental has any real role to play. Consider Davidson’s account: whether or not a given event has a mental description seems irrelevant to what causal relations it enters into. Its causal powers are wholly determined by the physical description or characteristic that holds for it; for it is under its physical description that it may be subsumed under a causal law.’

Kim proposes a different theory of identification: events are exemplifications of properties by substances at a time. This view is well known and seems to be the strongest account of token-identity, but it suffers from the difficulty not of being circular but of multiplying events beyond reason. If Kim’s theory is true, then, as the well-known critique goes, no stabbing is ever a killing, since the stabbing and the killing involve two different properties. Simply put, Kim cannot identify events with each other because they are not identical: Brutus’s stabbing of Caesar is not the same event as Brutus’s killing of Caesar. The upshot, as noted by Davidson, is that Kim cannot identify mental events with brain events, which seems to be a significant weakness for a token-identity theory.

Where does this meandering leave us, then? On SD, we have the interaction problem: how do two substances, one material and the other not, interact? On BP, it seems that there is serious difficulty giving an account of mental causation that is actually an account of mental causation. My first reaction, given the serious difficulties with thinking of the mind/body relation in terms of causal relations, is to think of that relation in non-causal terms instead. This is the Aristotelian position of hylomorphic dualism, which, for the sake of sanity, I will leave for another day. But the takeaway, I think, is this: perhaps the mind/body relation needs to be thought of in non-causal terms.


T.F. Torrance on Kant and Theoretic Structures

‘There is certainly a profound element of truth here, the fact that in all our knowing there is a real interplay between what we know and our knowing of it. Man himself is a part of nature and is so intimately related to nature that he plays a formative, and nature a productive, role in scientific inquiry, discovery and interpretation. This is everywhere apparent in the magnificent achievements of empirical and theoretic science, but the way in which Kant himself combined the theoretical and empirical components of the epistemic process has grave consequences.

It is certainly to be granted that we do not apprehend things apart from a theoretic structure, but if the theoretic structure actually determines what we apprehend, then what we apprehend provides no control over our understanding. The one way out of that impasse requires a theoretic structure which, while affecting our knowledge, is derived from the intrinsic intelligibility of what we seek to know, and is open to constant revision through reference to the inner determinations of things as they come to view in the process of inquiry. But this is ruled out by the Kantian thesis that the theoretic structure is aprioristically independent of what we apprehend and that there is no possible knowledge of things in their own inner determinations or relations.

While Kant was certainly concerned to show the limits of the pure reason, his theory of knowledge served to reinforce the Enlightenment doctrine of the autonomous reason (e.g. in its Lockean and Cartesian forms alike) and even to exalt it into a position beyond what had hitherto been claimed, where through prescriptive legislation it subdued nature to the forms of its own rational necessities. As F.C.S. Northrop expressed it: ‘For neither Locke nor Hume was the human person as a knower a positively acting creating being. With Kant the position is entirely changed. Apart from the knowing person, which Kant termed “the ego”, the a priori forms of sensibility and categories of the understanding which this ego brings to the contingent data of sense, there would be no single space-time world whatever, with its public, material objects and knowers. In this fashion Kant transforms modern man’s conception of himself from a merely passive into a systematically active and creative being.’ (T.F. Torrance, ‘Transformation and Convergence in the Frame of Knowledge’, p. 42, reformatted for ease of reading)

Kant and the Objectivity of Experience

(This is a rough gloss on Strawson’s exposition of Kant’s doctrines of unity and objectivity in ‘The Bounds of Sense’)

– Kant notes that our experience has to include the awareness of objects distinct from the state of being aware of them – call this the objective reference of experience. Put differently, experience has objective reference when its objects are conceived as distinct from the particular experience (or representation) of those objects.

– This is, in effect, the statement that we have to be aware of the thing-in-itself in order to have an objective reference. Our experience, if it is to have an objective reference, must be unified for it to be a representation of the objective world.

– Our empirical concepts, if they are to be employed at all, depend on this unified, coherent and connected experience.

– The issue here can be seen clearly: the objective world, the world of things-in-themselves apart from any perceptual activity or cognition of the knowing subject, must be known for our experience to have an objective reference, or for our representations to be of the real world. The things-in-themselves, however, lie outside our experience entirely – we are not aware of them. All we are aware of are appearances.

– Thus, if we are to use empirical concepts, we have to have a substitute objective reference. This substitute is, simply, the rule-governed connectedness of our experience and our representations. Strawson notes:

‘This surrogate is precisely that rule-governed connectedness of our representations which is reflected in our employment of concepts of empirical objects conceived of as together forming a unified natural world, with its own order, distinct from, and controlling, the subjective order of perceptions. Really, nothing comes within the scope of our experience but those subjective perceptions themselves; so that all that can be really understood by empirical knowledge of objects is the existence of such rule and order among those perceptions as is involved in our being able to count them as perceptions of an objective world, having its own independent order, to which we can ascribe, as a consequence, the order of our perceptions.’ (‘The Bounds of Sense’, p. 104)

– In other words, if I’m reading Strawson/Kant right, our perceptual experiences, being rule-governed and connected, give us empirical knowledge of objects, that is, knowledge of objects of experience, which we can ‘count’ as perception of the objective world.

Humean, All Too Humean

I intended this post to be a bit of reflection on agent causation and free will, but I was led in a more fundamental direction after concluding with Timothy O’Connor that an account of agent causation really depends on the impossibility of a Humean account of causation. This is a rather simple thesis that can be summed up as follows: agent causation (AC) takes as fundamental that causes really do necessitate their effects – let’s call this Real Causality (RC). Humean-ism (H) fundamentally denies that causes necessitate their effects. Therefore, the first step towards an account and defense of agent causation ought to begin with a look at the metaphysics of causation – more specifically, why we shouldn’t take H to be the case.

Tim Maudlin in his excellent volume ‘The Metaphysics Within Physics’ maps out H by way of two doctrines derived from a reading of David Lewis:

‘Doctrine 1 (Separability): The complete physical state of the world is determined by (supervenes on) the intrinsic physical state of each spacetime point (or each pointlike object) and the spatio-temporal relations between those points.’

‘Doctrine 2 (Physical Statism): All facts about the world, including modal and nomological facts, are determined by its physical state alone.’ (p. 51)

Maudlin then takes these ideas to task, drawing arguments against Doctrine 1 from quantum physics. Classical physics was indeed separable – the physical state of the universe is, more or less, determined by spatio-temporal relations, dispositions and properties in space and time. Maudlin spends a fair amount of time doing some pretty fancy math and comes to the conclusion that, given quantum theory as part of a true description of the world (which is a separate but related contention – Maudlin isn’t trying for an instrumentalist or consciousness-based interpretation of quantum theory here), separability cannot be sustained. He arrives here by an exposition of particle systems, spin states and entangled states, which is rather technical.
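
A minimal sketch of the kind of case at issue (the illustration is mine, not a reconstruction of Maudlin’s derivation): the spin singlet state of a pair of particles,

$$|\psi\rangle = \tfrac{1}{\sqrt{2}}\big(|\uparrow\rangle_1|\downarrow\rangle_2 - |\downarrow\rangle_1|\uparrow\rangle_2\big),$$

cannot be written as a product $|\phi\rangle_1 \otimes |\chi\rangle_2$ of states assigned to the two particles individually. The complete state of the pair therefore isn’t fixed by the intrinsic states of its parts plus their spatio-temporal relations, which is just what Doctrine 1 (Separability) requires.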

Doctrine 2 Maudlin takes to be indefensible as well, and I’ll quote him at length here:

‘It matters not whether one starts with Newton, who, in the Principia, simply announces his three laws of motion after giving the definitions of various terms, or whether one turns directly to any contemporary textbook on quantum theory, which will treat, e.g., the Schrodinger equation as a fundamental dynamical principle. Physicists seek laws, announce laws, and use laws, but they do not even attempt to analyze them in terms of the total physical state of the universe or anything else…Unlike reductive analyses of possibility, causality, and chance, reductive analyses of laws are not endorsed by scientific practice.

Indeed, scientific practice seems to preclude such an analysis. As we have seen, physical possibility is easily understood in terms of models of the laws of physics. Let us suppose (and how can one deny it) that every model of a set of laws is a possible way for a world governed by those laws to be. Then we can ask: can two different sets of laws have models with the same physical state? Indeed they can. Minkowski space-time, the space time of Special Relativity, is a model of the field equations of General Relativity (in particular, it is a vacuum solution). So an empty Minkowski space-time is one way the world could be if it is governed by the laws of General Relativity. But is Minkowski space-time a model only of the General Relativity laws? Of course not! One could, for example, postulate that Special Relativity is the complete and accurate account of space-time structure, and produce another theory of gravitation, which would still have the vacuum Minkowski space-time as a model. So under the assumption that no possible world can be governed both by the laws of General Relativity and by a rival theory of gravity, the total physical state of the world cannot always determine the laws. The only way out is either to assert that empty Minkowski space-time must be governed by both sets of laws, since it is a model of both, or (a more likely move) that it can be governed by neither set of laws, since neither is the simplest account of space-time structure adequate to the model (the simplest account is just Special Relativity). But how can one maintain that the General Relativistic laws cannot obtain in a world that is a model of the laws, and hence allowed by them? The necessity of distinguishing the physical possibilities (i.e. the ways the world could be given that a set of laws obtains in the world) from the models of the laws signals a momentous shift from philosophical analyses that follow scientific practice to analyses that dictate it.’ (p. 67-68)

There is no shortage of less physics-based reasons not to be a Humean, however. One might point out that Hume’s conclusions have force only if his empiricism is accepted, and there are many good reasons why it shouldn’t be – modern philosophy is, in fact, partly composed of such rejections (Reid, Sellars, and the rejection of the positivists make up part of this history. The positivists, who claimed that non-analytic statements, or statements that go beyond empirical justification, are meaningless, are left in a position which doesn’t exactly aid one in the search for the laws of nature. Nor are things such as quarks and their flavors logical constructions out of sense-data). This isn’t to say that a wholesale rejection of Hume is called for – his observation that causation is not empirical is absolutely correct, though not his further conclusion that it doesn’t exist at all, since causation is a very real, though metaphysical, category. But if the foundation of Humean-ism, which is a strict empiricism, isn’t sound, then we have far less reason to accept Humean-ism.

Given this all-too-cursory look at why we might not want to be Humean, what exactly follows? Concerning agent causation, we are left with a good bit of space with which to work, now that the shackles of Humean causation have been loosed – we are free to develop an account of agency and freedom in which agents are real causes of events.

Freedom and Its Human Face

– Timothy O’Connor helpfully distinguishes between the ‘capacity to choose’ and ‘freedom’ – the former is necessary but not sufficient for an account of free will. The latter, interestingly enough, can be diminished without the former being so.

– Crucial in O’Connor’s account of free will are reasons – reasons that are acted on (desires, beliefs, what have you) and reasons that are acted for (goals). Reasons are themselves non-causal, since O’Connor is defending agent causation, but they are causally influential. To use his terminology, reasons structure the agent-causal capacity.

– Self-knowledge plays a significant role here – if a person is unaware of the factors and reasons which motivate his action, then he has a lesser degree of freedom than someone who has greater knowledge of the factors motivating his action, since the more self-aware person will be able to reflect on his motivating factors and actions.

– Another crucial aspect of his account of freedom is the integrity of self-formation. Citing Robert Grosseteste’s angelic thought experiment (where an angel is formed for an instant with a full set of memories and psychological dispositions – which he doesn’t actually have, having existed for only an instant – and then makes a decision or chooses to act), O’Connor concludes that a person’s history (his/her full set of psychological dispositions, previous choices, character, etc.) is a source of freedom.

– This conclusion is reached by noting that, in the above thought experiment, the instant-existing angel merely has his ‘history’ as a ‘given’, which as such determines how he will act/choose – this ‘given’ is more or less the set of factors that shape our choices. If one has a real history, then one also has a ‘given’, but this ‘given’, as we grow and choose and interact with our world and are exposed to all kinds of rich new horizons, is shaped in such a way as to reflect more of our own action. Thus, our ‘given’ becomes more of our own creation, and through our actions in the world our freedom grows.

‘We come into the world with powerful tendencies that are refined by the particular circumstances in which we develop. All of these facts are for us merely ‘given.’ They determine which choices we have to make and which options we will consider (and how seriously) as we arrive at a more reflective age. However, presuming that we are fortunate enough not to be impacted by traumatic events that will forever limit what is psychologically possible for us, and, on the positive side, that we are exposed to a suitably rich form of horizon-expanding opportunities, the structure of our choices increasingly reflect our own prior choices. In this way, our freedom grows over time.’ – (Timothy O’Connor, ‘Freedom With a Human Face’)

– Simplifying that a bit: as a person grows and chooses, they shape some of the factors that shape their choices. All other things being equal, this effectively grows our freedom, since we shape our ‘given’ by our own choices. Perhaps an argument can be extracted:

If persons have histories, they are free

Persons have histories

Therefore, they are free

– While not very convincing, this may serve to show the point being driven towards.

Notes on Reconciliation and Redemption in Torrance

– Torrance defines reconciliation as having to do with the repaired or remade relation between God and man/creation, a relation of peace, love and unity. This is effected by God acting in Christ, so while God is the ultimate subject of reconciliation, Christ is the immediate subject.

– Reconciliation is cosmic and reaches out to all things – all things are reconciled, both to God and to each other. Christ is made the head of all things and in Christ all things are reconciled to God. Christ = reconciliation, and those ‘in Him’ live out this reconciliation.

– Redemption is a bit different – redemption here is seen as the destruction of the enslaving powers of sin and liberation from the bondage of sin and death, so that those liberated can become a kingdom of priests in their inheritance. This is clearly eschatological.

– There is a very obvious now/not yet dialectic. The ‘now’ = breaking in of God’s kingdom and freedom from sin, and our beginning to live out our vocations as priests in the kingdom. The ‘not yet’ = final state of consummation and final fulfillment of our redemption.

– The whole world is involved in redemption as it is involved in reconciliation. Torrance grounds both of these in God becoming a creature in space and time.

– Creation is under sin/curse, God’s act of redemption frees creation, and now creation waits and ‘groans’ in anticipation of the final eschatological redemption.

– Rough summaries: reconciliation is the inbreaking of the kingdom of God through the removal of enmity between God and man and the establishment of a unity of peace and love. Redemption is the breaking of and removal from the powers of sin and death – man is both redeemed and redeemed into his inheritance, and creation is freed from bondage.

A Problem for Direct Realism

Here I take a central thesis of a direct realist theory of perception to be this: if we are directly aware of objects, and not of a sense-datum or idea, then things such as colour must be such that reference can be made to them without reference to any subjective or phenomenal experience of perceivers. But we cannot seem to reference colour except by way of referencing it as we experience it, that is, by way of phenomenal concepts. How, then, can colour be referenced in a way that avoids phenomenal concepts and still be about colour in any coherent way?

John McDowell explains further, referencing J.L. Mackie’s view of primary and secondary qualities (on which qualities such as red need not be understood in terms of the experiences that red objects give rise to):

‘According to Mackie, this conception of primary qualities that resemble colours as we see them is coherent; that nothing is characterized by such qualities is established by merely empirical argument. But is the idea coherent? This would require two things: first, that colours figure in perceptual experience neutrally, so to speak, rather than as essentially phenomenal qualities of objects, qualities that could not be adequately conceived except in terms of how their possessors would look; and, second, that we command a concept of resemblance that would enable us to construct notions of primary qualities out of the idea of resemblance to such neutral elements of experience. The first of these is quite dubious…But even if we try to let it pass, the second seems to be impossible. Starting with, say, redness as it (putatively neutrally) figures in our experience, we are asked to form the notion of a feature of objects which resembles that, but which is adequately conceivable otherwise than in terms of how its possessors would look (since if it were adequately conceivable only in those terms it would be secondary). But the second part of these instructions leaves it wholly mysterious what to make of the first: it precludes the required resemblance being in phenomenal respects, but it is quite unclear what other sense we could make of the notion of resemblance to redness as it figures in our experience.’ (‘Values and Secondary Qualities’, in ‘Essays on Moral Realism’, ed. Geoffrey Sayre-McCord, p. 169)

I think the following argument can thus be extracted:

Direct realism holds that reference to colour (or any phenomenal quality) can be made apart from phenomenal concepts – or, there is a neutral figuring in experience for colour.

We cannot reference colour except by way of phenomenal concepts – or, there is no neutral figuring in experience for colour.

Therefore, a direct realism theory of perception is false.
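
A minimal formalization of the argument (the symbols are mine, not McDowell’s): let DR be the direct realist thesis and N the claim that colour has a neutral, non-phenomenal figuring in experience. Then the argument is a simple modus tollens,

$$DR \rightarrow N,\quad \neg N\;\;\therefore\;\;\neg DR,$$

so its force rests entirely on the second premise, which is where McDowell’s worry about ‘resemblance’ does its work.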

Kant and Non-Materialistic Naturalism

Kant is, interestingly enough, concerned to uphold naturalism without materialism. While this seems odd at first blush, his reasons for doing so are fairly interesting and constitute a project whose importance is universally acknowledged (though to what degree it succeeds is rather more in doubt). Let’s bracket the fact that Kant has only a small number of not-so-good arguments for his position, as well as some serious questions of coherence, and see just what happens when we dig through his thought.

In more contemporary terms, metaphysical naturalism generally cashes out to a kind of materialism or physicalism – the only things that there are are material things (or, if we want to Quine things up, whatever we’re committed to by our best theories). It is, at its broadest, non-supernaturalism. The physical, causal order is all there is, in one way or another.

Kant was a naturalist in a slightly different sense: he took everything to be governed by mechanical laws but wanted to resist and undermine the assumption of materialism. This is more or less one of the driving reasons behind his transcendental idealism, which may be best understood by contrast with its opposite, transcendental realism.

As I see Kant, he means two things by ‘transcendental realism’ (TR): (1) the epistemological thesis that we are fully aware of the limitations of our own mind and can thus know the things in themselves, and (2) the metaphysical thesis that things exist in time and space apart from human cognition. This is a problem because the mathematical and mechanical laws of nature, on this scheme, govern literally every thing, including the things in themselves – and from this, Kant takes it, materialism follows.

Kant’s idealism needs little introduction, but setting it against TR, we can see that the basic gist is that (1) we aren’t fully aware of the limitations of our mind and can’t know the things in themselves and (2) the objects of our experience, things in time and space, exist as a result of our cognition and conceptual activity.

What this doctrine secures is this: a naturalism without materialism. How? By restricting the mathematical and mechanical laws of nature to the objects of our experience, Kant has protected the things in themselves from being naturalized or material-ized.

Put another way: if we can experience or know the things in themselves, then the universal laws of nature apply to them, because they apply to everything. By restricting our knowledge and experience from the things in themselves, Kant has both secured his naturalism (because the laws of nature apply to everything we experience) and attacked materialism (by showing that the universal laws of nature do not apply to everything).

If Kant is right, then naturalism is correct in the sense that universal laws govern everything we experience – but by restricting this to the appearances, he can both avoid and attack materialism, since the laws apply only to our experience and not to the things in themselves. Thus, while everything we experience is ‘natural’, not everything is in nature.

Postmodernism, a Failure of Nerve?

‘Postmodernists nearly all reject classical foundationalism; in this they concur with most Christian thinkers and most contemporary philosophers. Momentously enough, however, many postmodernists apparently believe that the demise of classical foundationalism implies something far more startling: that there is no such thing as truth at all, no way things really are. Why make that leap, when as a matter of logic it clearly doesn’t follow? For various reasons, no doubt. Prominent among those reasons is a sort of Promethean desire not to live in a world we have not ourselves constituted or structured. With the early Heidegger, a postmodern may refuse to feel at home in any world he hasn’t himself created.

 Now some of this may be a bit hard to take seriously (it may seem less Promethean defiance than foolish posturing); so here is another possible reason. As I pointed out, classical foundationalism arose out of uncertainty, conflict, and clamorous (and rancorous) disagreement; it emerged at a time when everyone did what was right (epistemically speaking) in his own eyes. Now life without sure and secure foundations is frightening and unnerving; hence Descartes’s fateful effort to find a sure and solid footing for the beliefs with which he found himself. (Hence also Kant’s similar effort to find an irrefragable foundation for science.)

Such Christian thinkers as Pascal, Kierkegaard, and Kuyper, however, recognize that there aren’t any certain foundations of the sort Descartes sought—or, if there are, they are exceedingly slim, and there is no way to transfer their certainty to our important non-foundational beliefs about material objects, the past, other persons, and the like. This is a stance that requires a certain epistemic hardihood: there is, indeed, such a thing as truth; the stakes are, indeed, very high (it matters greatly whether you believe the truth); but there is no way to be sure that you have the truth; there is no sure and certain method of attaining truth by starting from beliefs about which you can’t be mistaken and moving infallibly to the rest of your beliefs. Furthermore, many others reject what seems to you to be most important. This is life under uncertainty, life under epistemic risk and fallibility. I believe a thousand things, and many of them are things others—others of great acuity and seriousness—do not believe. Indeed, many of the beliefs that mean the most to me are of that sort. I realize I can be seriously, dreadfully, fatally wrong, and wrong about what it is enormously important to be right. That is simply the human condition: my response must be finally, “Here I stand; this is the way the world looks to me.”

There is, however, another sort of reaction possible here. If it is painful to live at risk, under the gun, with uncertainty but high stakes, maybe the thing to do is just reduce or reject the stakes. If, for example, there just isn’t any such thing as truth, then clearly one can’t go wrong by believing what is false or failing to believe what is true. If we reject the very idea of truth, we needn’t feel anxious about whether we’ve got it. So the thing to do is dispense with the search for truth and retreat into projects of some other sort: self-creation and self-redefinition as with Nietzsche and Heidegger, or Rortian irony, or perhaps playful mockery, as with Derrida. So taken, postmodernism is a kind of failure of epistemic nerve.’ (Alvin Plantinga, ‘Warranted Christian Belief’)

Incommensurability and Private Language

– David Bohm argues in his talk in ‘The Structure of Scientific Theories’ that terms in a given scientific theory only have meaning within the context given by that theory. This can probably be called ‘strong incommensurability’ – no two theories seem to be able to talk to each other.

– What this leaves us with is a kind of private language for science – private theory language. If the terms in a theory have their meaning only within the context of that theory, then it would seem that, as far as theories are concerned, scientists are unable to talk to each other. Given, however, the fact that scientists do talk to each other (and sometimes even about each other’s theories), there must be a snag somewhere.

– Bohm’s solution (and he later acknowledges that though it looks as if he’s advocating a kind of solipsism, he’s not) is to try and show that until a kind of common language can be adopted, confusions will continue to crop up in theory development. He cites a number of scientific cases from quantum mechanics where confusion abounds. Some familiar examples might be von Neumann, Kepler/Newton, etc.

– I think it’s fair here to see Bohm as paying tribute to the positivist tradition (Carnap et al.) in his effort to move from ‘private theory language’ to a common kind of language – a project which saw a large reaction in ’60s and ’70s philosophy of science, especially in the area of theory-laden observation, which attacked the idea that there is even neutral sensory data and a neutral language to translate a theory from and into.

– Despite the significant confusions in science (Bohm is correct to identify these), it seems a bit shaky to assert that this confusion is something to be avoided at all costs by the adoption of a more neutral language (even though a Wittgensteinian picture of language may be of help here). Such confusions are a strict problem only if they stem solely from theories not being able to talk to each other and do nothing to advance science – and quite often these confusions help to sharpen, clarify and discard theories and concepts, and so help science advance.

– An example Bohm cites is malaria, which, throughout history, has had many different theories formed about its origin, structure, spread, etc. Bohm notes that each of these theories is incommensurable with the others – theorized causes ranged from bad air to damp air and so on, all of which seemed to be confirmed by the data – and that, effectively, the theories had nothing in common other than the fact that each dealt with malaria.

– In rebuttal, Robert Causey argues that, far from demonstrating strong incommensurability, this merely shows that some theories are harder to falsify and some easier to confirm. The current (correct) theory of malaria makes sense of the same data as the earlier, more primitive theories (damp air, bad water, etc.) – Causey more or less argues that the history of malaria shows that, far from being incommensurable, these theories dealt with the same problem and the same data. Causey further argues that to show the kind of incommensurability Bohm is driving at, Bohm would have to show (1) that the problems dealt with by the different theories really were different problems with only the mere appearance of being the same, (2) that the terms used by the different theories really were different, and (3) that the differences in these terms and their meanings are great enough to show that the problems the theories were dealing with really were different problems.

– This, though a crude sketch, shows that incommensurability carries a fairly high burden of proof if it’s going to be asserted in as strong a form as Bohm asserts it.