
THE MATHEMATICS OF ARCHIMEDES DISCOVERED IN QUANTUM MECHANICS
In a groundbreaking pair of papers recently published in Foundations of Physics, CPNS Senior Research Fellow Elias Zafiris demonstrates how the mathematics of Archimedes can be used to better understand the nature of quantum topological phases. This work opens important new doors in quantum theory, with potentially far-reaching practical applications.
Building on ideas Zafiris developed in his book with Michael Epperson, Foundations of Relational Realism: A Topological Approach to Quantum Mechanics, these new papers exemplify the potential value to modern physics of historically reexamining (and perhaps rehabilitating) ancient Hellenic ideas about the relationship between mathematics and nature.

The reality of quantum potential states
Science News editor Tom Siegfried recently reviewed a paper by CPNS Senior Fellows Stuart Kauffman, Michael Epperson, and Ruth Kastner: "Taking Heisenberg's Potentia Seriously," International Journal of Quantum Foundations 4:2 (2018): 158–172.
The central thesis of this paper is that quantum theory exemplifies an ontological duality of potential and actual states, a duality first introduced by Aristotle and later rehabilitated by Heisenberg. Each of the authors has developed this thesis individually in their own publications, including proposals that are fully formalized physically and mathematically. These are referenced on the CPNS Research Page.
Siegfried writes: "In the new paper, three scientists argue that including “potential” things on the list of “real” things can avoid the counterintuitive conundrums that quantum physics poses... At its root, the new idea holds that the common conception of “reality” is too limited. By expanding the definition of reality, the quantum’s mysteries disappear. In particular, “real” should not be restricted to “actual” objects or events in spacetime. Reality ought also be assigned to certain possibilities, or “potential” realities, that have not yet become “actual.” These potential realities do not exist in spacetime, but nevertheless are “ontological” — that is, real components of existence. 'This new ontological picture requires that we expand our concept of ‘what is real’ to include an extra-spatiotemporal domain of quantum possibility,' write Ruth Kastner, Stuart Kauffman and Michael Epperson."
Click here for the full article.

A welcome return to 'old school' natural philosophy
In their book Iconicity and Abduction (Volume 29 of Springer's Studies in Applied Philosophy, Epistemology, and Rational Ethics Series) Gianluca Caterina and Rocco Gangle rigorously explore the ways in which the relational structures of the natural world, when formalized category-theoretically, illuminate the relational structures (such as abductive inference) by which we attempt to understand the world.
"The book's thesis," write Caterina and Gangle, "is that the core method of category theory, which lifts properties characterizing individual objects to structural properties characterizing systems of relations linking individuals to one another, helps to illuminate the creative, context-dependent and tentative nature of abductive inference. In particular, category theory sheds new light on how and why mathematics itself is so often successfully employed in scientific hypotheses and their experimental testing.
"...We examine iconicity as a semiotic structure linking conceptual fields organized by logical relations on the one hand and real domains organized by structural and causal ones on the other such that the type of linkage itself thereby provides important clues as to the possibility and internal functioning of abductive inference within processes of scientific theory construction. In particular, we aim to show that the semiotics of iconicity as realized in the logic of presheaves and sheaves and rigorously formulated in the language of categorical topoi goes quite some distance in explaining the necessity and the success of the employment of mathematical tools in the scientific investigation of nature."
Among the category- and sheaf-theoretic frameworks they discuss is the relational realist, topological approach to quantum mechanics formalized by CPNS research fellows Elias Zafiris and Michael Epperson. Caterina and Gangle write, "An approach to quantum theory via sheaves and topoi very much akin to Flori’s in its formal dimension has been developed by Epperson and Zafiris in Foundations of Relational Realism: A Topological Approach to Quantum Mechanics and the Philosophy of Nature.
"For Epperson and Zafiris, a reconstruction within sheaf logic and topoi of quantum mechanics very similar to Flori’s supports a somewhat more controversial “relational realism” that amounts to a revisionary Whiteheadian metaphysics in which concrete, continuous spatiotemporal event structures are themselves specifications or contractions of more general and fundamental logical/mereotopological algebras as represented categorically by lattices. The relation between the two levels is represented in turn also categorically by presheaves and furthermore the functorial lifting of presheaves to sheaves then corresponds to the quantum mechanical passage from pure superpositions of states to calculable probabilities of observable outcomes of measurements...
" ...What is most important is the way that this relational ontology is at once grounded in and derived from the recasting of quantum physics in a topos-theoretical framework. In other words, what is salient here is the way that the overall claim of Epperson and Zafiris regarding the consistency of a Whiteheadian ontological interpretation of quantum theory is itself produced abductively on the basis of the internally relational or “arrows-only” orientation of category theory and topoi. The mathematics employed here is not only formally representational, but suggestively generative of ontological interpretations that play the role, from this perspective, of more or less likely hypotheses. It is thus not simply a matter of producing arbitrary theories or interpretations as constrained merely to be compatible with observational and theoretical data, but of using the iconic characteristics of such data to determine a space of motivated “hypotheses” that conform iconically to what must be explained or understood."

Angelaki Publishes Special Issue Presenting the Work of the Ontogenetics Process Group
The OPG is a research stream animated by the thesis that the living world in all its modes—biological, semiotic, economic, affective, social, etc.—escapes finite schema of description. Its work is based on a deep and sustained engagement with the biological, physical, and computational sciences, operating in conjunction with anthropological, philosophical, and artistic modes of inquiry.
Sha Xin Wei, Arts, Media & Engineering, Arizona State University; Stuart Kauffman, Complexity Theory, Santa Fe Institute; Giuseppe Longo, Mathematics & Theoretical Biology, École Normale Supérieure; Philip Thurtle, Comparative History of Ideas, University of Washington; Michael Epperson, History and Philosophy of Science, CSUS-CPNS; Cary Wolfe, Cultural Theory, Rice University; Gaymon Bennett, Religious Studies, ASU; Adam Nocek, Arts, Media & Engineering, ASU; Erin Espelie, Film Studies & Critical Media Practices, University of Colorado Boulder, Editor-in-Chief, Natural History magazine.
[Abridged excerpt from the Editorial Introduction by Cary Wolfe & Adam Nocek]
This special issue is the result of multiple years of collaboration among a diverse group of scholars, scientists, and practitioners. This group came to be known as the Ontogenetics Process Group (OPG). From the outset, what distinguished this unruly collective seemed to be a shared nostalgia for an intellectual space where scientists, humanists, and artists could engage in theoretical exchange without the pressure of superficial “outputs” to satisfy administrators, mixed with an insatiable hunger for the formation of an interdisciplinary conceptual frame capable of responding to pressing questions emerging not just from biological and computational systems, but also from the domains of social and cultural practice.
This kind of interdisciplinary theoretical endeavor has a long history, of course, one that, in recent decades, has been largely occluded by the rise of Big Data, the neo-Darwinian paradigm and its obsession with the genome as an engineerable “book of life,” and the assumption that “hard,” tenurable scientific knowledge is fundamentally quantitative in nature. It’s entirely possible, however, that that hegemony will, in the longer view, prove to be misguided or, at the very least, oversold, and in that longer view, the OPG might be situated genealogically somewhere between the intellectual investments of the Theoretical Biology Club in the 1930s (organicism), the interdisciplinary ambitions of the Macy Conferences in the 1940s and 1950s (which included figures as diverse as Warren McCulloch, Gregory Bateson, and Margaret Mead), and the restless game-changing and institution-building work of the Santa Fe Institute in the 1980s and 1990s. From the very beginning, Stuart Kauffman, one of the original members of the Santa Fe Institute, and a founding member of the OPG, would often remark (and we’re paraphrasing), “there is really something here that the complexity scientists over there at the Institute won’t be able to get their heads around.”
What Kauffman is referring to is the fundamental challenge that OPG researchers pose to what has become the lingua franca of theoretical biology: complex systems theory, built on a quantitative and mathematical template. For all the descriptive and predictive power that the complexity sciences offer (the ability to compute feedback systems, recursive networks, emergent dynamics, etc.), they also presume that the living world in all of its modalities (biological, semiotic, economic, affective, social) can be reduced to a finite schema of description that delimits in advance all possible outcomes. The mathematics of complexity functions like a “grid of intelligibility” for physicists, biologists, economists, information scientists, sociologists, and now many humanists; it permits the sciences of the living and nonliving to speak the same language. What distinguishes this group of researchers, and this special issue of Angelaki in particular, is the breadth of disciplinary and methodological frameworks brought to bear on the possibilities and limitations of this proposition. More than this, what is proposed here are conceptual architectures for the living that are not only irreducible to physico-mathematical frames of reference but that are also as vital as the phenomena they wish to express. In short: life is more complex than complexity.
Rarely, if ever, do we see an information scientist, a complexity theorist, a design and organizational theorist, a mathematician, a historian of science, an experimental filmmaker, an anthropologist of science and religion, a philosopher of physics, and a couple of theoretical humanists assemble in order to contemplate modes of living that are more complex than complexity. At the heart of this shared inquiry is a deep and sustained interest in biology, in questions of self-organization, morphogenesis, epigenetics, cultural inheritance systems (soft inheritance), downward and distributed causation, as well as the implications of quantum physics in these domains. But in taking these questions on board, especially in light of the work Longo and Kauffman have done on the limitations of complex systems science, two things become startlingly clear: (1) that cultural, political, and economic systems cannot be isolated from the physicochemical emergence of living phenomena; and (2) that the reigning models of complexity need to be paired with non-computational and non-algorithmic modes of inquiry in order to better express the unfolding of living worlds. And yet, just what relevance these extra-biological systems have and what modes of (non-algorithmic) inquiry are most appropriate (ethnography, mathematics, conceptual art, philosophy, speculative design) are not agreed upon and remain open for debate.
This lack of agreement should not be treated as a limitation, however. Where other anthologies, volumes, or working groups would demand a clear path forward, and might even insist upon formulating a “new science” out of the non-algorithmic study of the living, we maintain that this is precisely the style of thinking that leads to the metaphysics of life that we aim to critique. We therefore see the radical plurality of views, which do not always sit comfortably together, as a strength that forcefully demonstrates the resistance of the living to metaphysical capture.

THE PROBLEM OF MATHEMATICAL REDUCTIONISM IN PHYSICS
CPNS Research Fellow Michael Epperson critiques a familiar maneuver in popular physics books these days—claims of concretizing what is inescapably abstract, usually by way of a purely speculative and untestable assertion costumed mathematically as a testable hypothesis.
[Excerpt from IAI News. Click here for the full essay.]
"The Creative Universe"
by Michael Epperson
When celebrity physicists disagree about some fundamental prediction or hypothesis, there’s often a goofy and well-publicized wager to reassure us that everything is under control. Stephen Hawking bets Kip Thorne a one-year subscription to Penthouse that Cygnus X-1 is not a black hole; Hawking and Thorne team up and bet John Preskill a baseball encyclopedia that quantum mechanics would need to be modified to be compatible with black holes. Et cetera, et cetera. And even as we roll our eyes, we’re grateful, because at least some part of us does not want to see these people violently disagreeing about anything.
So when celebrity physicist Lawrence Krauss publicly called celebrity physicist David Albert a “moron” for not appreciating the significance of Krauss’s discovery of the concrete physics of nothingness, it caused quite a stir. In his book, A Universe from Nothing, Krauss argued that in the same way quantum field theory depicts the creation of particles from a region of spacetime devoid of particles (a quantum vacuum), quantum mechanics, if sufficiently generalized, could depict the creation of spacetime itself from pure nothingness. In a scathing New York Times review of Krauss’s book, Albert argued that claiming that physics could concretize “nothing” in this way was at best naïve, and at worst disingenuous. Quantum mechanics is a physical theory, operative only in a physical universe. To contort it into service as a cosmological engine that generates the physical universe from “nothing” requires that the abstract concept of “nothing” be concretized as physical so that the mechanics of quantum mechanics can function. What’s more, if quantum mechanics is functional enough to generate the universe from nothing, then it’s not really nothing; it’s nothing plus quantum mechanics.
This is a familiar maneuver in popular physics books these days—claims of concretizing what is inescapably abstract, usually by way of a purely speculative and untestable assertion costumed mathematically as a testable hypothesis. It is a cheap instrument, as attractive as it is defective, used more often as cudgel than tool for exploration. Fortunately, as we saw with David Albert, few despise its dull edge more than other physicists and mathematicians. During the first years of modern mathematical physics and the construction of its two central pillars, quantum theory and relativity theory, Alfred North Whitehead warned, “There is no more common error than to assume that, because prolonged and accurate mathematical calculations have been made, the application of the result to some fact of nature is absolutely certain.”
Whitehead would later generalize this error as the “fallacy of misplaced concreteness.” It is often oversimplified as merely mistaking an abstract conceptual object, like a mathematical or logical structure (e.g., the number zero, or the concept of “nothingness”), for a concrete physical object. But the fallacy has more to do with what Whitehead argued was the chief error in science and philosophy: dogmatic overstatement. We commit the fallacy of misplaced concreteness when we identify any object, conceptual or physical, as universally fundamental when, in fact, it only exemplifies selective categories of thought and ignores others. In modern science, the fallacy of misplaced concreteness usually takes the form of a fundamental reduction of some complex feature of nature—or even the universe itself—to some simpler framework. When that framework fails, it is replaced with a new reduction—a new misplaced concreteness, and the cycle repeats.