The Laws of Nature and the Nature of Laws
Laws as Realities and Laws as Theories
The word 'law' has a rather subtle double-meaning.
I will split it into the law-qua-theory and the
law-qua-governing-principle.
A law-qua-theory is one scientist's formulation of the
way a general class of events is interrelated,
such as Newton's (as opposed to Einstein's) law
of gravity. A law-qua-governing-principle
refers to an out-there
aspect of reality, which laws-qua-theory attempt
to encapsulate. Laws-qua-governing-principle
underpin the accuracy of the law-qua-theory in predicting future events.
The law-qua-theory is analogous to the map,
the law-qua-governing-principle to the territory.
It is common to think of laws-qua-governing-principle
as having a determining effect, as
requiring that, given a certain set of physical
circumstances, only one outcome is possible.
However a law-qua-governing-principle can govern
without determining, by stipulating that a
range of outcomes is possible, and anything else
impossible. This is rather like the rules of chess which
stipulate only certain moves as permissible, but do
not constrain the player into having only one permissible
move (it would not be much of a game if they did!).
It is also like the laws of a free society which tell
citizens what they must not do, but do not stipulate what they must.
The above is not merely speculative, either: if quantum mechanics
is as correct as it seems to be, the fundamental 'laws' of the
universe are of a governing-but-not-determining nature.
If they are, this does not, as is often mistakenly supposed,
undermine the predictive abilities of physical science,
since it is still possible to confirm theories
by probabilistic, statistical methods.
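To illustrate (a sketch of my own, not part of the original argument): a
probabilistic law determines no single outcome, yet the frequencies it
predicts are statistically testable. The predicted probability and trial
count below are arbitrary choices.

    import random

    # Hypothetical probabilistic law: "outcome X occurs with probability 0.5".
    PREDICTED_P = 0.5
    N_TRIALS = 100_000

    # No individual trial is determined, but the aggregate frequency is testable.
    hits = sum(random.random() < PREDICTED_P for _ in range(N_TRIALS))
    observed = hits / N_TRIALS

    # Standard error of a binomial frequency estimate.
    std_err = (PREDICTED_P * (1 - PREDICTED_P) / N_TRIALS) ** 0.5
    print(f"predicted {PREDICTED_P}, observed {observed:.4f}")
    print(f"consistent within 3 standard errors? {abs(observed - PREDICTED_P) < 3 * std_err}")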
When causal determinists are not being simply dogmatic,
they appeal to science to demonstrate that the universe
is governed by laws. Such laws cannot be laws-qua-theory,
mere maps, since maps do not guarantee that the future is fixed.
They must be laws-qua-governing-principle, and *determining*
ones at that. This means they are faced with two problems:
that of showing that laws-qua-governing-principle are constraining
to the point of being determining, despite what quantum mechanics says;
and that of explaining what kind of entity, ontologically speaking,
they actually are.
They are surely not spatio-temporal entities composed of matter-energy.
The law of gravity is not itself some kind
of material body floating about in space. Thus, determinists must
be committed to a belief in immaterial, non-spatio-temporal entities
of some kind, which is rather unfortunate for them, since suspicion
about such kinds of entities often underpins their scepticism about
the volitional powers of the mind.
Peter D. Jones 08/9/05
Laws, Necessity and Pattern
The idea of physical laws being eternal, unchanging, Platonically outside of
space and time, and rigidly binding on all events is certainly a fairly common way
of conceiving physical laws, but familiarity is not necessity.
It *is* logically necessary that the succession of states of the universe
is either:
patternless (i.e. random) or patterned.
Now, we can stop there and simply describe the pattern without attributing it
to any kind of law, but that seems unsatisfactory; presumably, there is
a reason for the pattern. So now we have:
patternless, or patterned for no reason, or patterned for a reason.
Now, does the reason for the patterning have to be an unchanging system
of eternal laws?
Suppose a toy universe has only two states available to it, A and B, and
suppose the observed pattern is
ABABABABABAABAABAABAABAABAABAABAABAAABAAABAAABAAABAAABAAABAAABAAAB
Initially, what we see here is a pattern following a rule --
A's are followed by B's, and B's are followed by A's --
but if we look at a further set of data, we see that the rule
itself is evolving -- perhaps according to a meta-rule.
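Here is a minimal sketch (mine, purely illustrative) of such a universe: the
rule says "k A's are followed by a B", and the meta-rule slowly increases k.
The phase lengths are arbitrary, chosen only to roughly reproduce the pattern
above.

    def toy_universe(blocks_per_phase=8, phases=3):
        """Runs of k A's terminated by a B (the rule), with k growing
        by one each phase (the meta-rule)."""
        out = []
        for k in range(1, phases + 1):         # meta-rule: k increases over time
            for _ in range(blocks_per_phase):  # rule: k A's, then one B
                out.append("A" * k + "B")
        return "".join(out)

    print(toy_universe())  # ABABAB...AABAAB...AAABAAAB...

An observer who has seen only the first phase would infer the simple
alternation rule; only further data reveals the meta-rule.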
So we are not forced into an either-random-or-determined dichotomy. The actual
logical necessity is something like:
random, or
patterned for no reason, or
patterned for a not-fully-constraining reason, or
patterned for a fully constraining reason.
Peter D. Jones 31/8/01
The Problems of Induction.
Induction is the epistemological process of deriving a general conclusion from
a limited set of data.
Induction is needed to found laws, and laws are, by most accounts, needed to
found causality.
It is not purely logical in the sense of making deduction from premisses.
In its strongest form, the problem of induction is the problem of deriving a statement
to the effect that "All A are necessarily F" from only a few examples of A which are F.
If it cannot be justified deductively, how can it be justified? Deduction is the gold standard.
If we allow sui-generis forms of justification, do we not open Pandora's box?
Possibly, nothing can be justified deductively because of the trilemma problem:
the Münchhausen trilemma, according to which any justification-of-justification is either
circular, infinitely regressive, or arbitrarily curtailed.
Empirical, aposteriori justification fails also. We cannot directly verify a universal ("for all")
statement. Specific examples fail. Terry the Turkey thinks December the 25th is going to be like
every other day. Appealing to the ultra-general meta-law that the future will be like
the past, and that the same cause-effect relations will continue to hold just because they
have held, is circular (at least if the meta-law is to be derived inductively).
Attempted solutions tend to relax either
- the requirement for universality, or
- the requirement for necessity
In other words, there are several weak forms of induction:
- "Most A are necessarily F"
- "All A are probably F"
- "Most A are probably F"
Induction and exceptions. White ravens and black swans.
TBD
Kant and Hume on Causality.
Kant's dilemma is that he
wants to get rid of the old scholastic metaphysics with its absurd and
unproveable dogmas, but he also needs a new metaphysics or a metaphysics-substitute in order
to support scientific and common-sense reasoning. The principal reason he believes that
he needs a new metaphysics or a metaphysics-substitute, is that he needs to support the
idea of causality, which is seemingly needed for Newton's new science, and in particular
he needs to support it against the sceptical attack of Hume.
Hume's contention that causality cannot be affirmed by the senses starts with the objection
that it is not a sense-datum like colour or size. It seems possible to get round this
by saying, in modern language, that causality is a higher-order datum, that we can confirm
causality by noting repeated patterns of events. This will not do for Hume and Kant,
however, because for them causality implies strict necessity, and even repeated empirical
observation cannot deliver logical necessity. What is more, the place one would normally
look to for logical necessity, abstract logical argumentation, cannot deliver necessity in
the case of causality. 'Every event has a cause' is not analytically and necessarily true
in the way that 'every child has a parent' is. Kant's way of getting free from the two
horns of this dilemma is to introduce a third option, the famous 'synthetic a-priori',
which delivers necessity without being analytical, and relates to experience without being derived
from it. The crux of the issue, leading to the need for the complex edifice of Transcendental
Idealism and the Synthetic A Priori, is the *necessity* of causality, which was at the time
an invariable assumption, made by Kant, Hume, Spinoza, and so on.
Kant claims that an apriori element is needed for causality. Surely he doesn't mean that
we know, apriori, that particular events follow on other events; we don't organise events
into a causal order like a man arranging a pack of cards. But if the apriori nature of causality
is only an apriori concept, not apriori knowledge, how does it give a basis for objective causality
at all? After all, the mere fact that we have a concept is no guarantee that it is applicable
objectively (for instance, the Euclidean concept of space we have is not borne out in physical
reality).
In view of Kant's statements that empirical data are needed to establish the particular laws of nature, what does he mean by "necessity" in his claims that apriori input is needed to give natural laws their necessity?
He may well mean, in accordance with the assumption of his day, that effects follow on their causes inevitably.
"Necessity" surely does not mean epistemic necessity -- if that were the case, laws would be deducable
rationalistically, and no empirical input would be needed. It does not mean necessity in the sense
of truth accross all possible worlds either -- again the empirical data from this world
are used to establish the natural laws of this world. Kant's paradigm example of natural laws
with necessity as opposed to those without is the Newtonian account of the solar system
as opposed to the Keplerian account. The Newtonian account is necessary in that an account of
the details of the motions of the planets (corresponding to the "merely empirical" Keplerian model)
can be deduced from a set of assumptions which apply to everything in the universe.
Thus the Kantian notion of necessity, as it applies to natural law, seems to have two ingredients:
1. Uniform applicability to everything in the universe
2. Hypothetical deduction of possible observations from theoretical assumptions
Cast in this form, it is not difficult to see why Kant is so insistent that natural laws require "necessity";
without (2) there is no possibility of answering hypothetical questions or making predictions. Laws lacking
(2) would be "merely empirical" -- just records of observations. (Of course laws are still empirical in that
we need to examine the empirical data in order to arrive at assumptions which predict them.)
And without (1), we could not establish (2): laws would be subject to arbitrary exceptions, and no reliable
predictions or deductions could be made.
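As a concrete illustration of ingredient (2) (my gloss, for the special case
of a circular orbit, not Kant's own working): from Newton's law of
gravitation and second law, GMm/r^2 = m(2*pi/T)^2 * r, which rearranges to
T^2 = (4*pi^2/GM) * r^3 -- Kepler's third law. The "merely empirical"
regularity is thus hypothetically deduced from assumptions that apply to
everything in the universe.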
However, Kant clearly thinks that a stronger form of necessity, necessity in terms of inevitable consequence, is needed.
Possibly the insistence on inevitability follows from the requirement for uniformity. Kant would not have been
the first or last person to see indeterminism as incomprehensible deviations from a law. However, this is a mistake.
Many scientific laws are formulated probabilistically, and the idea of a deviation from such a law
(for instance a die that always comes up six) is intelligible -- so the idea of indeterminism
without deviation is comprehensible.
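To make the point concrete (a sketch of my own): if the law says the die is
fair, then n consecutive sixes has probability (1/6)^n, so the always-six die
is a detectable deviation, not an unintelligible one.

    def p_all_sixes(n):
        # Probability, under the fair-die law, of n consecutive sixes.
        return (1 / 6) ** n

    for n in (5, 10, 20):
        print(n, p_all_sixes(n))
    # By 20 straight sixes the fair-die law is, for practical purposes,
    # refuted -- yet the law never forbade any individual roll.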
Thus an apriori notion of causality can play a useful role; it shows where we get the very idea of causality from
(accepting Hume's point that it cannot be derived from empirical observations of succession alone), even if all
we are to do with it is to apply it hypothetically and speculatively.
It could be objected to Hume's critique of causality that we can infer a normative law of nature -- a law
to the effect that B's will follow on from A's -- purely by noting that in the past B's have indeed followed on from A's.
However, to make the transition from the descriptive to the normative, it is necessary to appeal
to a meta-law: if a regularity in nature has been noted in the past, it will be repeated in the future.
Thus Hume's point remains valid: purely empirical observation cannot be used to infer normative laws.
However, the objection illustrates how a completely general principle of causality can bring something useful
to the table, namely a way of deriving specific laws from specific, non-normative data. In other
words, an apriori concept of causality combines with aposteriori data to produce synthetic, contingent
natural laws. But it is still only an apriori concept, not apriori knowledge.
It can, however, gradually approach the condition of knowledge in application. As more and better specific laws
are formulated, the general principle of causality that 'grounds' them is vindicated and justified. If causality
did not apply at all to the universe, we would not be able to formulate causal laws. But the process
of vindication is never completed, so epistemic necessity is never obtained by the principle of causality.
What we have here is effectively a third epistemic category. As knowledge, causality is not apriori,
nor aposteriori (in the sense that empirical data are aposteriori), but is more-than-aposteriori. It is
justified pragmatically, by its ability to serve as the basis of a system of knowledge, and its
level of justification grows open-endedly along with that knowledge. (However, this does not
stop a general principle of causality being apriori as a concept).
Must Popperians Completely Reject Induction?
(Background: Humean scepticism about causality, and
Popper's falsificationist conjecture-and-refutation replacement for
induction).
The general principle of induction/causality -- or a probabilistic version of it, in view of QM -- can
be supported by Popperian conjecture-and-attempted-refutation if anything can. So Popperians
are not obliged to reject induction entirely. This does not solve
the "strong" problem of induction, because it means that causality/induction is only a refutable
conjecture, not a necessary truth.
All inductive arguments are founded on the principle of induction
which is founded on C&AR, so necessary truth is not obtained.
Nonetheless there is nothing to stop Popperians saying that
fires will probably continue to cause smokes *because* fires in the past have done so.
THe "if it has happened before, it will (probably) happen again" rule,
the weak form of the general principle of induction, can be supported by C&AR, meaning
that individual law-like generalisations don't have to be (any more than they are alaready,
by inheritance as it were)
Popper points out that science does not work by observation
alone. How do you know what to observe?
You need a conjecture.
But there is equally a question of what to conjecture. That
is as much worth explaining as anything else.
Induction can explain why some conjectures are prima facie more plausible than others. The
conjecture "banging your head on the wall causes headaches"
is better supported than the conjecture "banging your head on the wall cures cancer",
even before any attempted refutation takes place. Science proceeds more
swiftly and surely than blind trial and error -- because its hypotheses
have a greater-than-chance plausibility. This breaks the analogy with
evolution, however. Perhaps it was a silly analogy, as scientists are much smarter than genes.
(Perhaps it isn't, because genetic mutations are mutations of other genes
which have already done a job successfully.)
What is going to have a hard time is anything involving
once-and-for-all certainty. That includes once-and-for-all falsification
as well as once-and-for-all justification. As Martin Gardner points out,
the Cosmological Constant has been in and out and back in again.
Note: Popperians need a principle of demarcation because induction cannot
be rationally justified; they need to demarcate scientific rationality from other
kinds.
Peter D Jones,Sussex,5/7/05
Exceptionless laws, or averaged-out approximations?
Historically, the most obvious way of combining the two (such as treating gravity as simply another particle field) ran quickly into what is known as the renormalization problem. In the old-fashioned understanding of renormalization, gravity particles would attract each other and adding together all of the interactions results in many infinite values which cannot easily be cancelled out mathematically to yield sensible, finite results. This is in contrast with quantum electrodynamics where, while the series still don't converge, the interactions sometimes evaluate to infinite results, but those are few enough in number to be removable via renormalization.
In recent decades, however, this antiquated understanding of renormalization has given way to the modern idea of effective field theory. All quantum field theories come with some high-energy cutoff, beyond which we do not expect that the theory provides a good description of nature. The "infinities" then become large but finite quantities proportional to this finite cutoff scale, and correspond to processes that involve very high energies near the fundamental cutoff. These quantities can then be absorbed into an infinite collection of coupling constants, and at energies well below the fundamental cutoff of the theory, to any desired precision only a finite number of these coupling constants need to be measured in order to make legitimate quantum-mechanical predictions. This same logic works just as well for the highly successful theory of low-energy pions as for quantum gravity.
-- Wikipedia
Philosophically, this seems to mean that physics doesn't work if its laws are treated as exceptionless and universal, but only if they are treated as approximations which cover special cases.