To understand the subject of statistics realistically, it is important to appreciate why and how statistics is used in the world at large. That is, why do we need statistics anyway? This question is, perhaps, a bit philosophical, yet I cannot overemphasize the need for thinking along these lines. Without proper perspective, statistics becomes a mere mathematical exercise, diverging from the true nature of the subject.
To see why statistics is a necessary type of reasoning, we must begin by addressing the nature of science and experimentation. A characteristic method used by scientists is to study a relatively small collection of objects, say 2500 people, and a characteristic, say longevity, and through experimentation or observation, draw a conclusion appropriate for the entire class of objects (i.e., people in general).
Suppose a study published results suggesting that people who own pets live longer.
Would this mean that
all people who own pets
are likely to live long lives?
Does owning a pet cause longevity?
Suppose the people in the study, by chance, were on the
whole, very healthy people, and therefore lived long lives:
Would this invalidate the researcher's assertion that people who own
pets live longer?
The obvious problem with this type of reasoning is that such conclusions can never be proved absolutely. This type of scientific reasoning is called
inductive reasoning and is inherently flawed. One can never study a
sample and expect conclusions to hold true for the entire population with
absolute certainty. This is exactly why statistics is needed.
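This sampling uncertainty is easy to see in a small simulation (a hypothetical sketch in Python using only the standard library; the population figures are invented purely for illustration): repeated samples of 2500 drawn from the very same population give slightly different answers each time.

```python
import random
import statistics

random.seed(1)

# Hypothetical population: 100,000 lifespans, roughly normal
# around 78 years (illustrative numbers, not real data).
population = [random.gauss(78, 10) for _ in range(100_000)]
true_mean = statistics.mean(population)

# Draw several samples of 2500 people each, like the study in the text.
sample_means = [
    statistics.mean(random.sample(population, 2500))
    for _ in range(5)
]

print(f"population mean: {true_mean:.2f}")
for m in sample_means:
    print(f"sample mean:     {m:.2f}")

# Each sample mean lands close to, but not exactly on, the population
# mean: an inductive conclusion drawn from any one sample carries
# uncertainty, even when the sampling is done perfectly.
```

No single sample reproduces the population value exactly; quantifying how far off a sample is likely to be is precisely the job statistics takes on.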
In contrast to the lack of certainty associated with inductive reasoning, the type of logic used in mathematics is absolutely certain. The mathematician begins with general principles and logically concludes more specific relationships. This type of reasoning, from the general to the particular, is called deductive reasoning. A rather simplistic
(but nevertheless correct)
example is based on the principle that two numbers can be added in any
order, thereby giving the same sum.
This is called the axiom of commutativity.
An example of deductive reasoning would
be to assert that since this holds for any two numbers, surely this
must hold for
two and three, in particular. We are, therefore,
absolutely certain that
2 + 3 = 3 + 2, given the axiom of commutativity.
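This deductive step can even be checked mechanically. The following is a minimal sketch in Lean 4, where the library theorem `Nat.add_comm` plays the role of the commutativity axiom:

```lean
-- The general principle: addition of natural numbers is commutative.
-- From it, the particular fact 2 + 3 = 3 + 2 follows with certainty.
example : 2 + 3 = 3 + 2 := Nat.add_comm 2 3
```

Because the conclusion follows from the axiom alone, no amount of further evidence could make it more certain, which is exactly the contrast with the inductive case.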
In its applied form, statistics then becomes a bridge between the inductive uncertainty
of science and the deductive certainty of mathematics. In his classic book,
The Design of Experiments,
Sir Ronald A. Fisher expresses this idea beautifully:
We may at once admit that any inference from the particular to the general must be attended with some degree of uncertainty, but this is not the same as to admit that such inference cannot be absolutely rigorous, for the nature and degree of the uncertainty may itself be capable of rigorous expression.
Statistics, therefore, is the mathematical method by which the
uncertainty inherent in the
scientific method is rigorously quantified.