
BLACK-BODY PROBLEMATIC: GENESIS AND STRUCTURE
I wish to thank all my teachers at the Department of Sociology, Delhi School of Economics,
especially Prof. Jit Singh Uberoi for having taught the truth and method of science. But for him I
could not have produced this paper. Sam Schweber provided a much needed encouragement by
sending in fruitful comments on the paper, all of which, unfortunately, I was unable to
incorporate. Finally, a heartfelt thanks to my students at Hindu College for welcoming me with a
deluge of love and affection.

In the history of physics, the black-body problematic played a crucial role in inaugurating a
fundamental innovation in scientific thought—the quantum innovation. The word problematic
has been used as Deleuze uses it – “the problematic refers to the ensemble of the problem and its
conditions of realization” (1994:177). In order to reconstruct the history of the black-body
problematic, I will look at the specific discursive configuration which actualized the conditions
necessary for its emergence and realization. In this process, Kuhn’s theory of the history of science will be re-examined. Contrary to Kuhn’s argument that rival paradigms emerge successively via scientific revolutions, I propose to show that rival paradigms coexist simultaneously, and that it is the consensus around each and the contradictions between them that produce innovations in science. My critique of Kuhn’s general theory will be carried out on the basis of his own
monograph on the black-body radiation problem (Thomas S. Kuhn, Black-Body Theory and the
Quantum Discontinuity, 1978).


I. The Problematic And The Framework
In this article I try to argue that the theoretical derivation of black-body radiation should be
located between three rival paradigms of physics—mechanics dealing with ponderable matter,
Maxwell’s theory of the electromagnetic field, and thermodynamics dealing with heat. There was an unreconciled tension among the three, and the black-body radiation problem promised a route to the union of electrodynamics, mechanics and thermodynamics. Boltzmann attempted to reconcile thermodynamics with mechanics through the kinetic theory of gases. Planck sought to reconcile thermodynamics with electrodynamics through the theory of thermal radiation. Einstein was the first to realize that still another element, linking mechanics and electrodynamics, was missing; he suggested that efforts to reconcile the Newtonian mechanics of discrete particles and the Maxwellian electrodynamics of continuous waves might involve the fundamentals of physics. Boltzmann’s efforts led him to create the subject of ‘statistical
mechanics or statistical thermodynamics’ which Planck followed in his derivation of the black-
body radiation law to which Einstein added the concept of light quanta. The simultaneous
presence of three rival paradigms, not the successive emergence of unique paradigms, outlined
the production of the quantum novelty in physics. In other words, it was precisely because the
theoretical derivation of black-body radiation encapsulated a few fundamental contradictions of
physics of the time that it was extremely well suited to inaugurate the quantum innovation.
Contrary to Kuhn’s general methodology, his own case study of black-body radiation does not
present classical physics as a monolithic body of doctrine within which the radiation law could
be established. In fact, at any point in the history of radiation research it can be shown that a
diversity of approaches involving different world views marked its treatment. From the
beginning, when Kirchhoff established the form of the radiation distribution function F(λ, T),
attempts to derive the function saw the coming together of various lines of thought which led to
new advances in radiation research. While Lommel (1878) based his understanding on a
mechanical model; Michelson (1887) used arguments from kinetic gas theory including
Maxwellian statistics of velocity distribution; Wien’s (1896) deduction was based on the second
law of thermodynamics and molecular assumptions; Planck brought to his new radiation law
(1900) many world views, the mechanical, the electromagnetic and the thermodynamical,
including Boltzmann’s combinatorials; Lorentz (1903) used the electron concept; Jeans (1905)
applied the equipartition theorem and Einstein (1905) took up the notion of energy fluctuations
to understand black-body radiation. What emerges from these various attempts is the recurrence
of the opposition between three theoretical frameworks—mechanics, thermodynamics and
electrodynamics; the opposition between two principles, reversibility and irreversibility; the
opposition between two types of laws—dynamical and statistical; and the opposition between
macroscopic and microscopic models. The oppositions, dualisms, contrasts or inversions
between theories, principles, laws and models form the framework of the black- body problem
and the subsequent quantum innovation.
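
For reference, and in modern notation rather than that of the original papers, two of the results just mentioned can be stated compactly. Kirchhoff’s theorem asserts that for any body in thermal equilibrium the ratio of emissive power e to absorptive power a is a universal function of wavelength and temperature, while Wien’s 1896 distribution proposes a specific exponential form for the spectral energy density, with α and β as empirical constants:

    \frac{e(\lambda, T)}{a(\lambda, T)} = F(\lambda, T), \qquad u(\nu, T) = \alpha\,\nu^{3} e^{-\beta \nu / T}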

For the argument that follows it is important to remember that for Kuhn (1970), a paradigm
defines a field of science and brooks no rival. Kuhn highlights the existence of unequivocal
paradigms in order to argue that the succession of paradigms via revolutions is the usual
developmental pattern of any mature science. His main point is that work within a well-defined
paradigm is more productive of revolutionary episodes of scientific innovations, than work in
which no similarly convergent standards are involved. However, as the thermal radiation case
proves, a single, homogeneous paradigm is quite incapable of generating any revolutions in science. Since opposition, dualism, differentiation and reflection constitute the method of production of scientific innovation, the concept of a universe of discourse is useful in uncovering the field of operation and articulation that renders them meaningful and efficacious.

I have suspended a purely sequential reading of the black-body narrative in favour of a structural
account that includes the notion of transformation through oppositions. It will be within a
‘structural’ conception that I work out the genesis of this specific problematic in physics.
Cassirer will be our guide, to some extent, for he has traced the nature of the quantum
development in his book Determinism and Indeterminism in Modern Physics (1956).


Alternative Solutions And Competing Problems
Cassirer argues that in the nineteenth century the particular problem that gave rise to a new
development of thought was the second law of thermodynamics and its conflict with the
prevailing mechanical world view. The law was first formulated by Sadi Carnot in 1824, who established that heat always passes from a hot body to a cold one when work is done in a cyclic process. The law was revised and reformulated by Clausius (1865) in terms of the concept of entropy, and he restated the two fundamental laws of the mechanical theory of heat as:

   i.   The energy of the universe is constant.
  ii.   The entropy of the universe tends to a maximum.

Cassirer writes that from a law of thermodynamics, entropy increase was translated as a general
principle of nature, the Principle of Dissipation and Disorder, which contradicted the idea that
the world is cyclic and may go on forever in the same way. The second law expressed the
irreversibility of physical phenomena and thus contradicted the presuppositions of classical
physics which are time reversible. He argues that “the entropy law thus came to be a kind of
irrational remainder, a foreigner and intruder in the securely articulated system of classical
mechanics and electrodynamics” (1956:76). It was this conflict which unsettled the status of the
mechanical world view.

The opposition between the two principles, reversibility and irreversibility, came into view
through the laws of mechanics versus the laws of thermodynamics. What this shows is that the
initial opposition proves difficult to ‘resolve’; it gets transformed and dissipated across
different frameworks. The thermodynamic principle of irreversibility, as an indicator of
evolution giving a mathematical expression for the ‘arrow of time’, was in contrast to the laws of
mechanics and electrodynamics. The differential equations of Newtonian mechanics are
invariant under time reversal, and the motion of individual mass points can be reversed simply by giving a minus sign to their velocities; similarly, Maxwell’s equations of
electrodynamics too are time reversible.
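
A minimal illustration of this reversibility, in modern notation: Newton’s equation of motion for a mass point in a velocity-independent force field retains exactly the same form under the substitution t → −t, with the velocities along the reversed trajectory simply changing sign; the source-free Maxwell equations are likewise invariant once the magnetic field is also reversed.

    m\,\frac{d^{2}\mathbf{x}}{dt^{2}} = \mathbf{F}(\mathbf{x}) \quad \text{is unchanged under} \quad t \to -t, \qquad \mathbf{v} \to -\mathbf{v}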

Alternatives to mechanism were proposed and hotly debated during the 1880s and 90s.

At one end were the energeticists (Mach, Ostwald, Helm) who challenged mechanics, atomism
and the autonomy of the second law. At the other end were Clausius, Helmholtz, Boltzmann and
others who were attempting to provide a mechanical proof of the second law by appealing to the
microscopic structure of matter and formulating laws of probability to bridge the conflict
between macroscopic (thermodynamic) irreversibility and microscopic (mechanical)
reversibility of molecular motions. Since in the communication of heat energy between
innumerable molecules, so small individually and so irregular in their distribution, it is impossible to follow individual molecular motions as one would in a strictly dynamical theory, it
was necessary to introduce statistics to explain the second law. Cassirer argues that in “viewing
the entropy law as a probability law there had been introduced into the very concept of law a
dualism wholly foreign to its original meaning” and the introduction of statistical laws on a par
with dynamic laws was the subject of debate and controversy. For the probability laws were not
seen as having “the same epistemological quality and ‘dignity’” as dynamic laws which were
regarded as absolute laws of nature excluding every exception. “But precisely this property
would have to be surrendered, if one were to go over to mere probability laws. An event no
matter how improbable is still not an impossible event; not only can it occur, but it will in
general occur one day, if we but extend our observations over a sufficiently long period of time”
(1956:77).

The tension between dynamic and statistical laws is most easily visible in the 1860s, when Boltzmann and Clausius were trying to give a strictly mechanical interpretation of the second
law while Maxwell insisted on the statistical character of the second law. Maxwell’s argument
took the form of what is known as ‘Maxwell’s Demon’, that while it would require the action of
the demon to produce an observable flow of heat from a cold body to a hotter one, this process is
occurring spontaneously all the time on a submicroscopic scale, perfectly consistent with the
laws of mechanics. Maxwell concluded that the chief end of the demon was to show that the
second law of thermodynamics has only a statistical regularity and not a dynamical certainty for
systems composed of large number of small molecules. His idea caught on quickly in his circle.
William Thomson and P.G. Tait were convinced that the truth of the second law is of the nature
of a probability and not an absolute certainty. This led to discussions of the possibilities inherent in applying statistics to thermodynamics, and it was Boltzmann who worked out precisely how the second law is related to mechanics, creating in the process the subject of ‘statistical mechanics’ or ‘statistical thermodynamics’.

One can see that solutions to these oppositions were no longer possible within any one
conceptual framework, so they were displaced across frameworks. My intention is to show that
the combination of these oppositions together with the potential of transformation that such a
combination involves led to new advances in scientific thought. In the following pages I will take
up the debate between Boltzmann, Planck and Einstein as a framework to locate the problematic
of black-body radiation. The debate which occurred between these three scientists, presenting
three different relations of two terms each, through the rival theoretical frameworks of
mechanics, thermodynamics and electromagnetism, reflects not just a concern with disciplinary
boundaries; each side of the debate presented alternative solutions to competing problems. The
debate also highlights the spirit of competition present in scientific work. Ironically, in the
scientific community, one’s ‘enemies’ are more significant than one’s ‘friends’, for the scientist
has to prove the validity of his theory more to his enemy than to his friend. The debate between
Planck and Boltzmann was particularly acrimonious, with Planck first accusing Boltzmann of wasting his energies on the atomic-kinetic gas theory and Boltzmann later attacking Planck’s
proof of irreversibility of radiation. Therefore, contrary to Kuhn’s standpoint, absence of a
paradigmatic consensus did not imply that there was no communication or debate or progress.
Dialogue in science is not always about agreement or how to come to an agreement but also
about how to disagree and how to come to a disagreement. The main participants do not seem to
be ‘talking through one another’ in the manner characteristic of Kuhnian paradigm shifts –
Boltzmann, Jeans, Lorentz and Einstein were as aware of the exact nature of Planck’s proposal
as he was aware of the exact nature of their objections. What allowed the different parties to the debate to enter into a mutually intelligible dialogue while holding different points of view was a universe of discourse which regulates, rather than constitutes, what can be said and how it is to be said.

(A) Thermodynamics and Mechanics—Boltzmann

The idea of treating energy as a discrete variable rather than a continuous one was first put
forward by Boltzmann in 1872 when he set out to provide a mechanical proof of irreversibility
through kinetic gas theory. He argued that from any arbitrary initial distribution of molecular
velocities, the effect of molecular collisions must always be to bring the gas to an equilibrium
distribution function, the Maxwellian distribution function. His approach was concerned not so
much with thermal equilibrium itself as with the irreversible processes by which equilibrium is
reached. The result (1872) was his H-Theorem: for non-equilibrium states, H is proportional, with a negative constant of proportionality, to the entropy; H tends to decrease to a minimum as entropy
increases to its maximum value. Once these extreme values are attained, corresponding to the
state of thermal equilibrium, the system will stabilize and H will remain constant.
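
Stated in modern notation (Boltzmann’s own 1872 paper arranges this differently), H is a functional of the velocity distribution f of a spatially uniform dilute gas, molecular collisions can only decrease it, and it is, up to a positive factor and an additive constant, the negative of the entropy:

    H(t) = \int f(\mathbf{v}, t)\, \ln f(\mathbf{v}, t)\, d^{3}v, \qquad \frac{dH}{dt} \le 0, \qquad S \propto -H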

It must be emphasized that while his derivation of the irreversible increase of entropy made use
of the statistical distribution of molecular velocities and energies, the result he asserted was
supposed to be a certainty and not a probability. The H-Theorem said that entropy would always
increase and he presented his theorem in a very deterministic phraseology. The essentially
statistical premises of his derivation seemed to vanish without a trace from its results which are
shown to have mechanical implications. The juxtaposition is what creates the tension in his text
and only reiterates the contradictions. However, Boltzmann was convinced that he had supplied a
mechanical proof of the second law of thermodynamics and solved the problem of irreversibility.

In 1876, Joseph Loschmidt argued that no such proof could be valid because all the laws of
mechanics are reversible in time. For every mechanically possible motion that leads towards
equilibrium, there is another, equally possible, that leads away from equilibrium and is thus
incompatible with the second law. This statement presents what has since been known as the
‘reversibility paradox’. Loschmidt concluded that the H-Theorem could not be a deterministic
theorem because there were some initial conditions from which H could for a time increase and
entropy decrease. Boltzmann conceded that Loschmidt was quite correct in asserting that entropy
decreasing processes existed and entropy decrease depended on special initial conditions. He
responded with a statistical interpretation: for some unusual initial conditions it is possible that entropy might decrease (and H increase) as time progresses. But those cases are, he wrote,
extraordinarily improbable and for practical purposes may be regarded as impossible.

Provoked by Loschmidt’s criticism, Boltzmann worked out a ‘combinatorial’ definition of
entropy (1878). He defined a distribution of molecules over finite cells (in the configuration
space of one molecule) as the number of molecules in each cell, and a complexion as the
specification for each molecule of the cell to which it belongs. The probability of a given
distribution was taken to be proportional to the corresponding number of complexions. Since all complexions have the same a priori probability (the number of particles in a cell, not their identity, being what defines a distribution), the entropy of a given macro-state is fixed by the number of complexions compatible with it. The equilibrium macro-state appears not only as the most likely to occur but also, in the sense that it can be achieved in the largest number of ways, as the final state towards which any evolution will lead, starting from an arbitrary initial state. In this way, he made entropy a measure of disorder— the tendency towards increasing entropy is simply a tendency towards increasing disorder. Moving beyond Maxwell, who had argued that the second law had only a statistical validity, Boltzmann made the second law a direct expression of the laws of probability: the entropy S is proportional to the logarithm of the probability W of that state, S = k log W. He recognized “how intimately the second law is connected to the theory of
probability and that the impossibility of an uncompensated decrease of entropy seems to be
reduced to an improbability” (in Klein,1973:73). But the improbabilities involved in the entropy
decreasing processes were tantamount to an impossibility. He chose Thomson’s example to
illustrate that one should not expect a mixture of nitrogen and oxygen gases to separate after a month, with pure oxygen in the lower half and nitrogen in the upper half of the
container, even though from the viewpoint of probability theory that outcome is only extremely
improbable, not impossible.
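
In the notation that later became standard (the constant k was written explicitly only in Planck’s work of 1900), the combinatorial definition just described reads: the number of complexions W of a distribution (n_1, ..., n_m) of N molecules over m cells is the multinomial coefficient, and the entropy is proportional to its logarithm,

    W = \frac{N!}{n_1!\, n_2! \cdots n_m!}, \qquad S = k \ln W.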

Thus, Boltzmann achieves a subtle reconciliation between macro-irreversibility and micro-
reversibility by developing a statistical theory of thermodynamic entropy of a system composed
of a vast number of molecules even though the individual molecular motions are described by
the reversible laws of mechanics. There is a convergence of the two approaches, the statistical
and the dynamical, to the theory of thermal equilibrium. However, the spheres of application of
statistical and dynamical laws are separated for Boltzmann in a simple fashion. The statistical
procedure is applied solely to the formulation of initial conditions whereas the further course of
events is regarded as being governed by strict dynamic laws.

Boltzmann’s probabilistic and microscopic interpretation of entropy was not accepted easily
when presented in the last quarter of the nineteenth century. Continuum sciences such as electromagnetism, thermodynamics and acoustics were in vogue. The energeticists were bitterly
opposed to atomistic models because they considered energy rather than matter as the final
reality. Moreover, difficulties appeared with Boltzmann’s attempt at bridging the gap between
thermodynamics and mechanics through probability laws that could only be laboriously removed
by introducing special hypothetical assumptions. One such hypothesis introduced was that of
‘molecular chaos’ which implied that certain actual configurations of molecules within
individual cells be prohibited, either initially or as the motion proceeds, configurations which the
laws of mechanics taken alone would otherwise allow.

Another objection to the H-Theorem came in 1896 from Ernst Zermelo, Planck’s assistant in Berlin, who developed what has since been known as the ‘recurrence paradox’. Applying a mathematical theorem published by Poincaré in 1890, Zermelo argued that any
mechanical system confined in a finite region of space would after a sufficiently long time
ultimately return to its initial configuration. This contradicted not only the H-Theorem but also
any kinetic theory of heat because if thermodynamics is thought to obey mechanical laws on the
microscopic scale, entropy should behave periodically rather than monotonically. “In such a
system”, Zermelo wrote, “irreversible processes are impossible” (in Kuhn,1978:26). Thus,
mechanics could never be the basis of physics for it dealt only with cyclic processes. Such
cycles, according to the second law, were not natural at all. To this Boltzmann replied in the
same journal, Annalen der Physik, that entropy was not simply mechanical but also statistical.

At this point, Planck continued the debate by asking if probability alone can determine the
direction in which a system develops. He answered in the negative. He wrote “probability
calculus can serve if nothing is known in advance, to determine the most probable state. But it
cannot serve if an improbable initial state is given, to compute the following state. That is
determined not by probability but by mechanics”(in Kuhn,1978:27). However, Planck disagreed
with Zermelo’s contention that the entropy law as a natural law is really incompatible with every
mechanical interpretation of nature. Planck argued that if one passes from discrete point masses,
such as molecules, to a mechanical continuum, such as the electromagnetic field, where every part
is tied to every other part like an organic whole, then “a strict mechanical significance can be
found for the second law” (ibid.). Initially Planck was opposed to Boltzmann’s molecular proof
of irreversibility. He believed that, like energy, entropy had to be determined not only by the macroscopic (thermodynamic) state of the system but also by the underlying microscopic (mechanical) state. His problem was to find a proper micro model, not one based on discrete
mass points of molecular gas theory but on continuous matter like electromagnetic ether. In his
programme (1897), he claimed that the reversible equations of electromagnetic wave theory did
explain thermodynamic irreversibility.

(B) Electromagnetism and Thermodynamics—Planck

To Planck, a derivation of the second law meant a derivation of irreversibility which he thought
he could derive by combining thermodynamics and electromagnetism. The locus for such work
was provided by black-body radiation on which he started work from 1897 onwards. His main
aim was to show that reversible equations of Maxwell’s electromagnetic wave theory could be
used to explain irreversible thermodynamic processes. Initially he put forward the model of a
single Hertzian resonator in a cavity which he later changed to a model with several resonators at
one frequency enclosed in a cavity interacting with the electromagnetic field. If an arbitrary
initial distribution of energy is injected into the cavity, then the resonators in the cavity would
induce a strictly irreversible evolution of radiation from an incident plane wave to an outgoing
spherical wave tending towards spatial homogeneity, isotropy and polychromy. The distribution
will move towards equilibrium and entropy would increase until equilibrium is achieved. If one
had a formula for the entropy of radiation as a function of the field variables, then the black-body distribution function would be the one that maximized the total entropy of the radiation in the cavity. For him, a byproduct of this study would be the derivation of the law of the spectral energy distribution of thermal radiation. He thought he could thus retrieve irreversibility from the electromagnetic equations of the field in the cavity.
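
The quantitative core of this programme, in a standard modern rendering rather than Planck’s original notation, is the relation he obtained between the mean energy U(ν, T) of a resonator of frequency ν and the spectral energy density u(ν, T) of the field with which it is in equilibrium; given this relation, a formula for the resonator entropy as a function of U would suffice, on maximizing the total entropy, to fix the black-body distribution:

    u(\nu, T) = \frac{8\pi \nu^{2}}{c^{3}}\, U(\nu, T)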

Boltzmann replied that such a miracle could not be performed. The equations of electrodynamics
like those of mechanics are invariant under time reversal. All processes that satisfy them can run
in either direction and are thus reversible. Any irreversibility in the effect of resonators that
Planck finds, Boltzmann commented, derives from his choice of uni-directional initial
conditions. Planck conceded Boltzmann’s point and recognized the crucial importance of
suitable initial conditions in non-statistical proofs of irreversibility.

In 1898, Planck introduced a special hypothesis for radiation theory, as Boltzmann had for gas
theory, one that would prohibit certain initial and boundary conditions so that change would proceed
irreversibly. As a physical hypothesis about the distribution of micro states, the role of ‘natural
radiation’ is to permit a derivation and a definition of the probability of a state: the states must
be so chosen that they provide the probability demanded by the selection of special initial
conditions that preserve the absolute validity of the entropy law. The hypothesis, governing the
distribution of initial conditions within individual intervals or regions, determines combinatorial
probability and thus entropy: in fact prohibiting those configurations which violate the second
law becomes a means of fixing the relative probability of the states that remain.
Some have read these special hypotheses as a stipulation of randomness (e.g., Stephen Brush,
1976). But Kuhn writes that neither Planck nor Boltzmann’s other contemporaries equated
molecular disorder with randomness. Planck’s ‘special assumption’ does not demand that certain
actual configurations of the molecules within individual cells be improbable, “but rather that they
never occur at all, either initially or as the motion proceeds” (1978:67). Darrigol (1988) argues that Planck’s generic notion of ‘elementary disorder’, covering both molecular chaos and natural radiation, in this way provides the precondition and also the strict guarantee for the
validity of the second law of thermodynamics to which he was firmly attached. Thus, Planck
reinterpreted Boltzmann’s combinatorial definition of entropy and probability in a way
compatible with the notion of absolute entropy increase. This conveys Planck’s struggle to retain the absoluteness of the entropy principle alongside a probabilistic definition of entropy.
It indicates how closely both Planck and Boltzmann shared the problem of demonstrating
irreversibility. Both started out by seeking a deterministic proof of irreversibility; both were
forced to settle for a statistical proof eventually.

The key point of Planck’s derivation (in its 1901 form), in which he is said to have introduced energy quantization, was his adoption of Boltzmann’s atomic-statistical model in toto, despite his earlier opposition to Boltzmann. However, the inner difficulties of Boltzmann’s gas theory came to light when
Planck made an attempt to transfer it to radiation theory. In subdividing the energy continuum into cells or elements of size ε, Planck was following Boltzmann, who had introduced such a subdivision (first in 1872) as a mathematical device when presenting a probabilistic derivation of the entropy and velocity distribution of a gas. In Boltzmann’s derivation, the precise size of ε made no difference. But in Planck’s derivation, the cell size had to be fixed in proportion to the resonator frequency, with h as the constant of proportionality. This restriction puzzled him. But it was for him a restriction on cell size, not on resonator energy, and it did not, therefore, bring anything like energy quantization to
mind. The energy elements did not express an intrinsic discontinuity of resonator energy; they
had no relation to Einstein’s later light-quanta. Yet after introducing this novelty in his
derivation, Planck himself does not identify it as the fundamental novelty of his theory. Rather
the innovation that he thought he had introduced was the constant h (later known as Planck’s constant), a catalyst for the harmonious unification of statistical thermodynamics and classical electrodynamics.
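
A compact sketch of the counting step just described, in modern notation (Planck’s papers of December 1900 and 1901 arrange the argument differently): distributing P indistinguishable energy elements of size ε = hν over N resonators, taking S_N = k ln W together with the thermodynamic relation 1/T = ∂S/∂U, and using the equilibrium relation between resonator energy and field energy yields the new radiation law,

    W = \frac{(N + P - 1)!}{P!\,(N - 1)!}, \qquad \varepsilon = h\nu,

    U(\nu, T) = \frac{h\nu}{e^{h\nu/kT} - 1}, \qquad u(\nu, T) = \frac{8\pi\nu^{2}}{c^{3}} \cdot \frac{h\nu}{e^{h\nu/kT} - 1}.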

Planck’s attempts to resist micro-(atomic) models, the statistical interpretation of irreversibility
and the discontinuous structure of radiation, were all unsuccessful and he gradually realized it. In
a lecture given in 1914, Planck finally declared, “Now, no doubt is left to the physicist bound by inductive demonstration: matter is made of atoms, heat is agitation of molecules and the
conduction of heat, like every irreversible process, obeys not dynamical but statistical laws, that
is to say probability laws…. On an individual process, the second principle can state something
with certainty only to the extent that one already knows that the course of this particular process
is not substantially different from the average course of a great number of processes that all start
from the initial state”. In a later talk, (1926), Planck was quite explicit on this point, “The second
law of thermodynamics loses its status of principle; it survives only as a statistical law that is
valid not for the properties of an individual system but only for the mean values of a great
number of identical macroscopic exemplars of the system” (in Darrigol, 1990: 279).

Kuhn’s major interest in reconstructing the history of the quantum innovation is to show that even
after he arrived at his quantum hypothesis, Planck still saw it as a direct extension of classical
physics. It was others, mainly Einstein and Ehrenfest, who recognized the revolutionary nature of
Planck’s derivation, namely that it required the restriction of resonator energy to integral multiples of the energy element hν, and who were convinced that no classical (mechanical or electromagnetic) model of
black-body radiation could succeed. So while Kuhn (1978,1984) claims that Einstein should be
credited with the quantum discovery for consciously making the innovation, it is Planck, who
was actually unaware of the nature of the innovation he had unknowingly introduced, to whom the
innovation is imputed and after whom the constant, quantum of action h, and the black-body
radiation law are named. More radical theories than Planck’s (Einstein, Ehrenfest) and more
conservative (read classical) theories than Planck’s (Jeans, Lorentz) brought out the full potential
of a theory, scarcely enunciated by its author. It was in this cooperative and competitive
enterprise that Planck unknowingly produced a fundamental innovation in scientific thought.
Thus, scientific work, like any other work requires cooperation and team-work in the production
of new theories and techniques, yet recognition and reward accrues only to an individual and not
to the team. This is the fundamental ‘social fact’ of the social organization of the scientific
profession.

Kuhn interprets Planck’s concern for a dynamical proof of irreversibility as not being central to
the actual derivation of the new black-body radiation law which he brought out in December
1900 and for which he is remembered. He recognizes that a dynamical versus a statistical proof
of irreversibility was significant in Planck’s work but writes that what provided the actual
impetus to the radiation law were the new radiation experiments, which invalidated rival laws such as those of Wien, Rayleigh and Jeans and simultaneously provided the impulse for Planck to work out a new radiation function (1978: 91-92). There is no doubt that experimental
confirmation is necessary to any scientific outcome, but to see it as the fundamental basis of
scientific innovations will not help us to discover and define the logic of the conditions of its
existence. Moreover, if one were to read Planck’s innovation of the quantum by placing it in the
matrix of three theoretical frameworks, one is convinced immediately of the centrality of the
question of thermodynamic irreversibility versus reversibility in the emergence of the quantum.

(C) Mechanics and Electromagnetism—Einstein

After working on the relation between macroscopic thermodynamic quantities and molecular structures, and after providing a foundation for thermodynamics on the joint basis of the equations of mechanics and the theory of probability, Einstein was struck by the problem of black-body radiation as an illustration of the need to unify the foundations once again, this time between the discrete particles of Newtonian mechanics and the continuous field of Maxwell’s electromagnetic theory. In his 1905
paper, “On a Heuristic Point of View Concerning the Production and Transformation of Light”, he proposed
that Maxwell’s electromagnetic wave theory might not be the last word on the subject and one
should explore the idea that light behaves like a collection of independent, localized particles of
energy – the light quanta. He proceeded from Wien’s idea of 1900 that electromagnetic waves of
short and long wavelength differ qualitatively as well as quantitatively from each other. The long
wavelength radiation could be described by the known laws of electrodynamics but short
wavelengths required different laws for explanation. Considering only this short wavelength
form of the radiation spectrum, Einstein showed purely thermodynamically that the entropy of
black-body radiation in a given wavelength interval depends upon the volume of the enclosure in
exactly the same way that the entropy of an ideal gas depends upon its volume. By interpreting
entropy statistically, he recognized that it was simply the independence of the motions of the gas
molecules that produced a particular form for its entropy. His next step was to make a logical
leap: if the entropy of radiation has the same form as that of a gas and if the entropy of a gas has
that form because it consists of independent particles, then radiation too must consist of
independent particles of energy. Einstein, however, argued for more than the necessity of
introducing discontinuities into black-body theory. The concepts of light-quanta and of the restriction of resonator energy to multiples of hν had entered together in Einstein’s papers of 1905 and 1906 and they
remained for him parts of a single if unfinished theory. Since the first was abhorrent even to
those scientists persuaded of the second’s necessity, disentangling the two or finding a substitute
for both was to be a central task in the further development of the quantum programme.
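
The comparison Einstein drew can be summarized, again in modern notation rather than the radiation constants of the 1905 paper, as follows: for monochromatic radiation of total energy E in the Wien (short-wavelength) regime, the dependence of the entropy on the volume V of the enclosure has the same form as that of an ideal gas of N independent molecules, provided one sets N = E/hν,

    S - S_0 = k\,\frac{E}{h\nu}\,\ln\frac{V}{V_0} \quad \text{(Wien-regime radiation)}, \qquad S - S_0 = k\,N\,\ln\frac{V}{V_0} \quad \text{(ideal gas)}.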


Generative Discourse And Scientific Logic: Innovation Within Limits
In the above section, I tried to argue that the black-body problematic should be located in a
matrix of competing and opposing paradigms. In Kuhn’s own example of the radiation case, its
history clearly reveals (a) the simultaneous presence of opposite and competing paradigms and
(b) the coexistence of rival conceptual structures, as not just a historical contingency but a logical
necessity, for the articulation as well as the ‘resolution’ of the radiation problem. The articulation
of the tensions between rival theories is a necessary condition for the production of a critical
discourse. In this way one can see that the rationality of science is to be understood not only
historically but also logically. This also explains the uniqueness of the radiation problem in the
development of quantum physics. For many commentators have argued that the specific heats of
solids at low temperature, a long standing problem in physics and chemistry, could also have
provided the impetus for a change in the foundations of physics. But the logical and historical
validity of the cavity radiation case as a case involving the three basic conceptual foundations of
physics, as I have outlined above, makes it the central factor in the emergence of the quantum
innovation. Moreover, this is also the reason why one can’t read the quantum story in a linear
way where the black-body radiation case could appear as an ‘anomaly’ or a puzzle in the
Kuhnian sense of ‘normal science’, with the above controversy signaling a period of
‘revolutionary science’ and the subsequent quantum innovation as a ‘paradigm shift’ in the
history of science. To think of something as an anomaly presumes that it is a divergence or a
deviation from the norm. But the pattern of scientific appraisal sketched above reveals not the
presence of one norm, from which infractions could be possible, but three fundamental structures
of thought. What it shows is the effort of the scientific community to place the problem in a variety of conceptual frameworks within which the black-body distribution function could be obtained. But this does not indicate, in Feyerabend’s terminology, the proliferation of incommensurably different conceptual structures. In this controversy, because all three theoretical structures were dominant, or none was, the participants focused their appraisal on comparisons between the explanatory scope and outcomes of rival theories. Therefore, the choice is not between a single monolithic conceptual framework (Kuhn) and an anarchistic welter of incommensurable conceptual frameworks or practices (Feyerabend, social constructivism); the task, rather, is to understand the logic of a scientific situation in terms of the dualisms, oppositions and contrasts
which form the conditions of existence for the generation of any innovation.

Through these controversies I have also attempted to show that if one were to look outside the
textbook tradition, then one would rarely find a completely unchallenged paradigm of scientific
research. Recourse to the body of journal literature, the medium through which natural scientists
report their original work, assess and evaluate that done by others and often modify their own
work, immediately casts doubt upon Kuhn’s implications of the standard textbook schema of
scientific progress in terms of paradigms and revolutions. The black-body radiation case shows
that all the scientists like Boltzmann, Planck or Einstein were reporting their progress on the
case, their findings on the subject and their dissension with their colleagues on the matter, by
sending their research publications to various journals. The great diversity of conceptual
frameworks encountered, and the meaningful dialogue between them, in the various journals of
the time forms the basis of my critique of the Kuhnian argument for the existence of unequivocal
paradigms as a hallmark of a mature science. (Kuhn’s main point is that work within a well-
defined paradigm is more productive of revolutionary episodes of scientific innovations, than
work in which no similarly convergent standards are involved.) Of course it is possible that competing paradigms do not enjoy equal prestige; one may be projected as the official paradigm, the orthodoxy endorsed by the textbooks of the time, while the other is formulated as a dissent from, or challenge to, the orthodoxy that finds no mention in the textbooks but is reported in the journals. Thus, in spite of the limits the orthodoxy puts on the forms of thought that can be discussed freely and legitimately, these forms nevertheless do not disappear completely; they may simply go underground, only to appear again. This is because the textbooks
partake of the ideal of uniformity or homogeneity to which modern Western science aspires and
thus ignore alternative and oppositional tendencies. In contrast to Kuhn’s claim that modern
natural sciences (unlike the arts and the social sciences) aspire to an ideal of uniformity, Cassirer
writes (1956) that the unity of natural knowledge does not demand any such uniformity. He
argues that “all scientific thought is dominated and guided by two opposing tendencies that are
engaged in a continual process of mutual adjustment. The demand of ‘specification’ is the
counterpoise to the claim of ‘homogeneity’. The struggle between these two cannot be decided
purely objectively from the nature of the object. It is a dissension and competition that belongs not so much to the nature of things as to scientific reason itself. In this sense, homogeneity and specification were introduced into Kant’s Critique of Pure Reason, not as constitutive principles, pertaining to the knowledge of objects, but as regulative principles, as maxims of scientific
inquiry” (1956: 80).

Within the scientific discourse, competing theories give conceptual expression to diverse and
opposing physical phenomena and thus arises the difficult and paradoxical task of permeating
each with the others in a dialectical way and so rendering them complementary to each other. We
saw, for instance, how Boltzmann’s explanation of macroscopic thermodynamic irreversibility
was conceptualized in the model of microscopic mechanical reversibility of molecular motions;
Planck used Boltzmann’s statistical definition of entropy in gas theory to justify a dynamical
interpretation of radiation entropy and derive the radiation law, with Boltzmann’s constant k
forming the link between the two; and finally, following Wien’s parallel between the radiation
pressure and the pressure of a Maxwellian gas, Einstein treated radiation itself as a gas of
independent particles of energy (quantized). All the scientists recognized that radiation theory
and gas theory were different and yet found it necessary to model one on the other, with Einstein
finally transforming the difference into a resemblance. As Deleuze writes, “the question
‘What difference is there?’ may always be transformed into: ‘What resemblance is there?’”
(1994 : 12).

The debate, outlined above, exemplifies Kuhn’s point that a scientific theory is declared invalid
only when an alternative candidate is available, contrary to Popper’s claim (1974) that scientific
developments proceed by “falsification” of theory by direct comparison with nature and to
Lakatos’ “sophisticated methodological falsificationism”, where modifications to a “research
programme” arise by comparing the revised version with a previous stage in the development of
the selfsame programme (1970:118). Kuhn writes, “the decision to reject one paradigm is always
simultaneously the decision to accept another and the judgement leading to that decision
involves the comparison of both paradigms with nature and with each other” (1970:77). But
while for Kuhn this occurs only during relatively infrequent periods of crisis, for Feyerabend
(1970) the proliferation of alternatives to the dominant view goes on all the time in science.
However, despite this difference both accounts fail to provide an intelligible principle of motion
of scientific change. The major weakness in Kuhn’s theory of scientific revolutions is his failure
to explain why and how the presence of anomalies (which are always there) will sometimes
precipitate a crisis for a paradigm and usher in a revolution. Similarly, in Feyerabend’s theory,
given the proliferation of alternatives (all the time), he is unable to explain why and what
changes during discontinuous paradigm shifts.

My reading of the black-body radiation case ends here. It was an attempt to understand the
significance of the cavity radiation case in the emergence of quantum physics and to outline the
conditions of its existence in the light of current theories of history of science. Scientific
discourse, like language, is not just referential; it also exists as a kind of lateral message
indicating its own process of formation. Paradigm, construction, research programme, discourse
are all different metaphors which help us to understand its formative, reflective and critical
character. The dynamism of scientific discourse is such that the universe of discourse becomes at
the same time a discourse of the universe.

				