01 Nov 2018 03:00 PM - 05:00 PM(America/Vancouver)
Colossi with Feet of Clay (*) – Stable Theories and Fragile Foundations
Many of our best scientific theories share a common feature: their wide acceptance and use, and thus their stability, are paralleled by a persistent and sometimes growing dissatisfaction with their foundations. The most conspicuous case is the 90-year-old controversy over the proper foundations of the highly successful quantum theory. Quantum theory is not alone, however. The same was true, for instance, of Newton’s mechanics and absolute time and space, differential calculus and its foundations, the role of the elusive ether in the propagation of electromagnetic phenomena, the mechanism behind the Darwinian principle of natural selection, the mechanical foundation of the second law of thermodynamics, and the explanation for Wegener’s tectonic plates. Reflecting on this coexistence of stable scientific theories with uncertain foundations may shed new light on the public image of science and illuminate how science’s strengths derive from its historical constitution rather than from clear-cut or axiomatic foundations, or from a once-desired unified science. These reflections require both historical studies and philosophical investigations, thus contributing to filling the contemporary gap between the two fields. This session brings together historical case analyses and philosophical reflections on the coexistence, not always peaceful, of successful scientific theories and their disputed foundations. (*) This metaphor was used by Franck Laloë in his book “Do We Really Understand Quantum Mechanics?”
Organized by Olival Freire Junior (Universidade Federal da Bahia, Brazil)
Commentator – Alexei Kojevnikov (UBC, Canada)
Boren, Fourth Floor · History of Science Society 2018 · meeting@hssonline.org
The Uncertain Foundations of the Renormalization Program: Attitudes towards Quantum Electrodynamics in the 1950s
Physical Sciences · 03:00 PM - 03:24 PM (America/Vancouver)
Physicists often praise quantum electrodynamics (QED) as “the most precise scientific theory ever constructed.” Its calculations depend on a technique called renormalization, developed circa 1947 by Hans Bethe, Sin-Itiro Tomonaga, Julian Schwinger, and Richard Feynman. That technique managed to eliminate the divergent quantities that had plagued QED’s calculations in the 1930s and 1940s. Already in the late 1940s, the renormalization program found important allies, such as Wolfgang Pauli, Léon Rosenfeld, and Freeman Dyson. A new generation educated in the United States in the early 1950s learned that QED was no longer a problem and that they should turn to the other fundamental interactions, namely the gravitational and nuclear ones. Following this narrative, several historians of science (Silvan Schweber, Jagdish Mehra, and Alexander Rueger, among others) claimed that 1947 was a watershed in the history of QED, when the old problems were finally solved. In this talk, I discuss whether that narrative is adequate. I analyze several discontents of the renormalization program, namely Rudolf Haag, Fritz Bopp, Irving Segal, and Arthur Wightman, who believed that the renormalization methods rested on questionable foundations and were, for some of them, plain nonsense. I also discuss the position of Gunnar Källén, a supporter of the renormalization program who was nevertheless an opponent of Schwinger’s methods. I claim that, outside a limited circle of physicists, QED was far from being considered a solved problem in the 1950s, and that the standard narrative aligned perhaps too closely with Schwinger’s and Feynman’s own perspectives.
Thiago Hartz (Universidade Federal do Rio de Janeiro)
Securing the Foundations of Theories without Physical Postulates: The Case of Quantum Mechanics
Physical Sciences · 03:24 PM - 03:48 PM (America/Vancouver)
The ideal strategy for securing the foundations of an empirically successful theory is to provide physical postulates from which it can be unambiguously reconstructed. Identifying such postulates, especially if one wants them to be indisputable, may however prove difficult. Such is notoriously the case for quantum mechanics. What, then, are the alternative strategies for providing the theory with some kind of foundational legitimacy? When its formalism is mature enough, one can try to compensate for the lack of foundational rigor by showing that the mathematical structure underlying the theory is in some sense “necessary.” To this end, one imposes requirements on how physical systems or situations are to be handled formally, and then attempts to show that the formalism of the theory is a solution, hopefully unique, to those requirements. To achieve the proper sense of necessity and to avoid ad hoc justification, the requirements have to be as general as possible. On the other hand, they also have to be “natural,” and it is no wonder that the frontier between physical postulates and such formal requirements eventually gets blurred.
This strategy has been used in the various axiomatizations of quantum mechanics. The talk will examine the rise of such approaches in the history of quantum theory and the (sometimes heated) debates they prompted. Special attention will be given to the rise of the so-called Geneva School.
From Light Quanta to Bosons: Conceptual Foundations and Interpretive Flexibility
Physical Sciences · 03:48 PM - 04:12 PM (America/Vancouver)
When the Bose-Einstein and Fermi-Dirac statistics were first formulated and explored, their conceptual foundations raised more questions than the formal apparatus of the theories could answer. The interpretive flexibility of the theories, however, did not deter physicists from probing their applicability to various physical systems and integrating them into networks of practice. It was only over the following two decades, through the tumultuous developments of the 1930s and early 1940s, that a unified interpretation was formulated, one that viewed both quantum statistics as consequences of a radical break from the classical conception of radiation and matter. After a brief survey of the interpretive diversity of the early period (1924-1926), some reformulations and uses of the quantum statistics in the period 1927-1946, for example by George Uhlenbeck, Ralph Fowler, Fritz London, Erwin Schrödinger, and Paul Dirac, are examined, with a focus on the role played by local contexts and traditions of theoretical practice in the eventual emergence of the new foundational categories of “bosons” and “fermions.”
The Discovery of RNA Splicing as a Surprise: Stability of the DNA-Protein Co-Linearity Theory or Faulty Foundations of Biological Diversity?
Physical Sciences · 04:12 PM - 04:36 PM (America/Vancouver)
The discovery of RNA splicing in 1977 is widely considered a turning point in molecular biology, often viewed as the starting point of the RNA revolution. By showing that many eukaryotic messenger RNAs are not co-linear with DNA but rather the products of multiple splicings of non-contiguous segments of a primary transcript of the genome, the discovery led to a new paradigm of genetic regulation. The theory of co-linearity, established in the early 1960s, became so entrenched by the mid- and late 1970s that it prevented the most advanced labs from interpreting the accumulating evidence in favor of “splicing.” The talk examines whether the stability of the co-linear theory, or paradigm, was accompanied by faulty foundations, such as the assumption that eukaryotes are no different in their life mechanisms from prokaryotes. If so, why could some labs (e.g., J. Darnell’s at Rockefeller University) not give up their belief in the strong co-linear theory, thus missing the discovery of RNA “splicing,” viewed as the third most important discovery in molecular biology (after DNA structure and mRNA function), while other labs (e.g., Cold Spring Harbor and MIT) were able to abandon the co-linearity theory, perhaps because they were more aware of the faulty foundations of a worldview with limited possibility for genomic diversity? The talk further compares and contrasts the leading contender labs in terms of their affinity for stable theories, faulty foundations, new experimental opportunities, social composition, and mentorship by leading scientists.
The Triumph of Bohr: The Folk History of Quantum Mechanics and the Tension between a Stable Theory and Its Fragile Foundations
Physical Sciences · 04:36 PM - 05:00 PM (America/Vancouver)
There is a "folk history" of quantum physics within the community of physicists, one that bears little resemblance to the history of the field. According to this folk history, there is a single orthodox "Copenhagen interpretation" which solves or dissolves all of the questions at the foundations of quantum mechanics. This interpretation has been in existence since the Bohr-Einstein debates of 1927, if not before. Moreover, the folk history goes on to claim that Bohr successfully dismissed Einstein's challenges at every turn, and the discovery of Bell's theorem three decades later only solidified Bohr's triumph. This folk history falls apart upon even cursory examination: there is no single coherent position known as the Copenhagen interpretation, nor has there ever been one. And none of the positions that go by the name “Copenhagen interpretation” do a good job of solving the measurement problem, the central interpretive problem at the heart of quantum foundations. Nor do they evade the nonlocality that is dictated by Bell’s theorem—nonlocality that was first pointed out by Einstein, and that was ignored by Bohr's followers. Yet this folk history is still common knowledge among physicists, likely because it serves an important psychological function: it allows physicists to ignore the troubled foundations of quantum mechanics so they can get on with using the (phenomenally powerful) theory. In this talk, I will examine the origins of the folk history, the documentary evidence that belies it, and some of the effects of the folk history's persistence within the field.